Abstract
Neural machine translation systems with subword vocabularies are capable of translating or copying unknown words. In this work, we show that they learn to copy words based both on the context in which the words appear and on features of the words themselves. In contexts that are particularly copy-prone, they even copy words that they have already learned they should translate. We examine the influence of context and subword features on this and other types of copying behavior.
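The open-vocabulary behavior described above rests on subword segmentation: any unseen word can be decomposed into known subword units, so the model can emit it piece by piece. The following sketch (not from the paper; the vocabulary and words are hypothetical) illustrates a simple greedy longest-match segmentation of the kind BPE-style vocabularies make possible:

```python
# Illustrative sketch: greedy longest-match subword segmentation.
# A fixed subword vocabulary can cover any word, including unseen
# ones, which is what lets an NMT model copy a word subword by subword.

def segment(word, vocab):
    """Greedily split `word` into the longest subwords found in `vocab`,
    falling back to single characters (assumed to be in `vocab`)."""
    pieces = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab or j == i + 1:
                pieces.append(piece)
                i = j
                break
    return pieces

# Hypothetical subword vocabulary; single characters serve as fallback.
vocab = {"trans", "lat", "ion", "cop", "y"} | set("abcdefghijklmnopqrstuvwxyz")

print(segment("translation", vocab))  # ['trans', 'lat', 'ion']
print(segment("copyable", vocab))     # ['cop', 'y', 'a', 'b', 'l', 'e']
```

Because even a word never seen in training decomposes into in-vocabulary pieces, the model faces no out-of-vocabulary symbol and can choose, per subword, between translating and copying.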
Knowles, R., & Koehn, P. (2018). Context and copying in neural machine translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018 (pp. 3034–3041). Association for Computational Linguistics. https://doi.org/10.18653/v1/d18-1339