One of the weaknesses of Neural Machine Translation (NMT) is in handling low-frequency and ambiguous words, which we refer to as troublesome words. To address this problem, we propose a novel memory-enhanced NMT method. First, we investigate different strategies to define and detect troublesome words. Then, a contextual memory is constructed to memorize which target words should be produced in which situations. Finally, we design a hybrid model that dynamically accesses the contextual memory so as to correctly translate the troublesome words. Extensive experiments on Chinese-to-English and English-to-German translation tasks demonstrate that our method significantly outperforms strong baseline models in translation quality, especially in handling troublesome words.
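As a rough illustration of one possible detection strategy mentioned above, the sketch below flags low-frequency word types by a simple corpus-frequency threshold. The function name and threshold value are assumptions for illustration only, not the paper's actual criterion, which may also account for ambiguity.

```python
from collections import Counter

def find_troublesome_words(corpus, freq_threshold=5):
    """Flag low-frequency word types as potentially troublesome.

    corpus: iterable of tokenized sentences (lists of tokens).
    Returns the set of word types whose corpus frequency is below
    freq_threshold (an illustrative cutoff, not taken from the paper).
    """
    counts = Counter(tok for sent in corpus for tok in sent)
    return {w for w, c in counts.items() if c < freq_threshold}

# Example usage on a toy corpus
corpus = [["the", "cat", "sat"], ["the", "dog", "ran"], ["the", "cat", "slept"]]
print(find_troublesome_words(corpus, freq_threshold=2))
# {'sat', 'dog', 'ran', 'slept'}
```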
Zhao, Y., Zhang, J., He, Z., Zong, C., & Wu, H. (2018). Addressing troublesome words in neural machine translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018 (pp. 391–400). Association for Computational Linguistics. https://doi.org/10.18653/v1/d18-1036