Neural machine translation with key-value memory-augmented attention

Abstract

Although attention-based Neural Machine Translation (NMT) has achieved remarkable progress in recent years, it still suffers from repeated and dropped translations. To alleviate these issues, we propose a novel key-value memory-augmented attention model for NMT, called KVMEMATT. Specifically, we maintain a key-memory, updated at every decoding step, to keep track of the attention history, and a fixed value-memory to store the representations of the source sentence throughout the whole translation process. Via nontrivial transformations and iterative interactions between the two memories, the decoder focuses on more appropriate source words when predicting the next target word at each decoding step, and can therefore improve the adequacy of translations. Experimental results on Chinese-English and WMT17 German-English translation tasks demonstrate the superiority of the proposed model.
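
The abstract describes the mechanism only at a high level. As a rough illustration of the read/write pattern it outlines, the PyTorch sketch below shows one decoding step: the decoder state addresses the key-memory, a context vector is read from the fixed value-memory, and the key-memory is then rewritten to record the attention just spent. All module names and update rules here are assumptions for illustration, not the paper's exact transformations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KeyValueMemoryAttention(nn.Module):
    """Illustrative key-value memory-augmented attention (one read/write round).

    key_memory:   one slot per source word, rewritten at every decoding step
                  so it records the attention history (hypothetical rule).
    value_memory: fixed source representations, read from but never written.
    """

    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(2 * hidden_size, 1)                  # attention energy
        self.update_gate = nn.Linear(2 * hidden_size, hidden_size)  # gated key rewrite

    def forward(self, decoder_state, key_memory, value_memory):
        # decoder_state: (batch, hidden); memories: (batch, src_len, hidden)
        src_len = key_memory.size(1)
        query = decoder_state.unsqueeze(1).expand(-1, src_len, -1)

        # Address the key-memory with the current decoder state.
        energy = self.score(torch.cat([query, key_memory], dim=-1)).squeeze(-1)
        attn = F.softmax(energy, dim=-1)                            # (batch, src_len)

        # Read the context vector from the fixed value-memory.
        context = torch.bmm(attn.unsqueeze(1), value_memory).squeeze(1)

        # Write: nudge each key slot toward the query in proportion to the
        # attention it just received, marking heavily used source words so
        # later steps are steered away from repeating or dropping them.
        gate = torch.sigmoid(self.update_gate(torch.cat([query, key_memory], dim=-1)))
        new_key_memory = key_memory + attn.unsqueeze(-1) * gate * (query - key_memory)

        return context, attn, new_key_memory
```

The paper iterates such interactions between the two memories within each decoding step; a single read/write round is shown here for brevity.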

Cite

APA

Meng, F., Tu, Z., Cheng, Y., Wu, H., Zhai, J., Yang, Y., & Wang, D. (2018). Neural machine translation with key-value memory-augmented attention. In Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI 2018) (pp. 2574–2580). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/357
