Document-Level Neural Machine Translation With Recurrent Context States


Abstract

Integrating contextual information into sentence-level neural machine translation (NMT) systems has proven effective for generating fluent and coherent translations. However, taking too much context into account slows these systems down, especially when context-aware models are applied on the decoder side. To improve efficiency, we propose a simple and fast method that encodes all sentences in an arbitrarily large context window. It builds contextual representations in the course of translating each sentence, so the overhead introduced by the context model is almost negligible. We evaluate our method on three widely used English-German document-level translation datasets, obtaining substantial improvements over the sentence-level baseline with almost no loss in efficiency. Moreover, our method achieves performance comparable to previous strong context-aware baselines while speeding up inference by 1.53×; the speed-up grows further as more context is taken into account. On the ContraPro pronoun translation dataset, it significantly outperforms the strong baseline.
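The abstract does not spell out the architecture, but the core idea of a single recurrent context state that is updated once per sentence and reused by the decoder can be sketched as follows. This is a minimal illustration under assumed names and dimensions (RecurrentContextState, sent_dim, ctx_dim are hypothetical), not the authors' actual model.

import torch
import torch.nn as nn

class RecurrentContextState(nn.Module):
    """Hypothetical sketch: keep one fixed-size context vector that is updated
    once per source sentence, so the per-sentence overhead of the context model
    stays constant regardless of how many sentences the document window covers."""

    def __init__(self, sent_dim: int, ctx_dim: int):
        super().__init__()
        self.update = nn.GRUCell(sent_dim, ctx_dim)      # recurrent update over sentences
        self.to_decoder = nn.Linear(ctx_dim, sent_dim)   # project the state for the decoder

    def init_state(self, batch_size: int) -> torch.Tensor:
        # Start every document with an empty (zero) context state.
        return torch.zeros(batch_size, self.update.hidden_size)

    def step(self, sent_repr: torch.Tensor, state: torch.Tensor):
        """Consume the pooled representation of the current sentence and return
        (context feature for the next sentence, updated state)."""
        new_state = self.update(sent_repr, state)
        return self.to_decoder(new_state), new_state

# Toy usage: pooled encoder outputs for 4 consecutive sentences of one document.
ctx = RecurrentContextState(sent_dim=512, ctx_dim=256)
state = ctx.init_state(batch_size=1)
for sent_repr in torch.randn(4, 1, 512):          # one pooled vector per sentence
    ctx_feature, state = ctx.step(sent_repr, state)
    # ctx_feature would be fed to the sentence-level NMT decoder as extra context.

Because the state is carried forward rather than re-encoding the whole context window for every sentence, the extra cost per sentence is a single recurrent update, which is consistent with the efficiency claim in the abstract.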

Cite (APA)
Zhao, Y., & Liu, H. (2023). Document-Level Neural Machine Translation With Recurrent Context States. IEEE Access, 11, 27519–27526. https://doi.org/10.1109/ACCESS.2023.3247508
