Deep Character-Level Neural Machine Translation By Learning Morphology
A new method for NMT by Shenjian Zhao (Shanghai Jiao Tong University) and Zhihua Zhang (Peking University)
- Goal: NMT aims at building a single large neural network that can be trained to maximize translation performance.
- Problem: The use of a large vocabulary becomes the bottleneck in both training and improving performance.
- Methods: Two recurrent networks and a hierarchical decoder that translates at the character level.
- Advantages: It avoids the large-vocabulary issue entirely, and it is more efficient to train than word-based models.
- Experiments: Higher BLEU scores, and the model learns morphology.
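A toy sketch (not the paper's model) of why character-level modeling sidesteps the vocabulary bottleneck: morphological variants multiply the number of word types, while the character inventory stays small and fixed. The stems and suffixes below are illustrative examples, not data from the paper.

```python
# Morphology inflates a word-level vocabulary: each stem x suffix
# combination is a distinct word type the model must cover.
stems = ["translat", "learn", "decod", "encod", "train"]
suffixes = ["e", "es", "ed", "ing", "ion", "ions", "er", "ers"]

# Word-level vocabulary grows multiplicatively with morphology.
words = {stem + suffix for stem in stems for suffix in suffixes}

# Character-level vocabulary is bounded by the alphabet size,
# no matter how many inflected forms appear.
chars = {c for word in words for c in word}

print(f"word types: {len(words)}, character types: {len(chars)}")
```

Here 5 stems and 8 suffixes already yield 40 word types, while the character set stays near a dozen symbols; this is the asymmetry that makes a character-level decoder attractive.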