Exploiting cross-sentence context for neural machine translation

134 citations · 200 Mendeley readers

Abstract

In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, the history of preceding sentences is summarized in a hierarchical way. We then integrate the historical representation into NMT with two strategies: 1) a warm-start of encoder and decoder states, and 2) an auxiliary context source for updating decoder states. Experimental results on a large Chinese-English translation task show that our approach significantly improves upon a strong attention-based NMT system by up to +2.1 BLEU points.
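The hierarchical summary described above can be sketched as two stacked recurrent passes: a word-level RNN turns each previous sentence into a vector, and a sentence-level RNN turns those vectors into a single history vector, which can then warm-start the decoder (strategy 1). The following minimal sketch uses plain tanh-RNN cells and illustrative dimensions; the cell type, sizes, and parameter names are assumptions for exposition, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
D_EMB, D_HID = 8, 8  # illustrative embedding / hidden sizes

def rnn_summary(vectors, W_in, W_rec):
    """Run a simple tanh RNN over `vectors`; return the final hidden state."""
    h = np.zeros(D_HID)
    for v in vectors:
        h = np.tanh(W_in @ v + W_rec @ h)
    return h

# Illustrative (randomly initialized) parameters for the two levels.
W_in_word = rng.normal(size=(D_HID, D_EMB))
W_rec_word = rng.normal(size=(D_HID, D_HID))
W_in_sent = rng.normal(size=(D_HID, D_HID))
W_rec_sent = rng.normal(size=(D_HID, D_HID))

def summarize_history(prev_sentences):
    """prev_sentences: list of sentences, each a list of word embeddings.
    Level 1: one vector per previous sentence (word-level RNN).
    Level 2: one vector for the whole history (sentence-level RNN)."""
    sent_vecs = [rnn_summary(s, W_in_word, W_rec_word) for s in prev_sentences]
    return rnn_summary(sent_vecs, W_in_sent, W_rec_sent)

# Strategy 1 from the abstract: warm-start the decoder with the history
# vector instead of a zero (or source-only) initial state.
history = summarize_history([
    [rng.normal(size=D_EMB) for _ in range(5)],  # previous sentence, 5 words
    [rng.normal(size=D_EMB) for _ in range(3)],  # previous sentence, 3 words
])
decoder_init = history  # used as the decoder's initial hidden state h_0
print(decoder_init.shape)
```

Strategy 2 would instead feed `history` as an extra input at every decoder step; the warm-start variant shown here is the simpler of the two to illustrate.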

Citation (APA)

Wang, L., Tu, Z., Way, A., & Liu, Q. (2017). Exploiting cross-sentence context for neural machine translation. In EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 2826–2831). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d17-1301
