A Document-Level Neural Machine Translation Model with Dynamic Caching Guided by Theme-Rheme Information

Abstract

Research on document-level Neural Machine Translation (NMT) has attracted increasing attention in recent years. Although prior work has shown that inter-sentence information improves NMT performance, it remains unclear which information should be treated as context. To address this problem, we propose a novel cache-based document-level NMT model that performs dynamic caching guided by theme-rheme information. Experiments on the NIST evaluation sets demonstrate that our model achieves substantial improvements over state-of-the-art baseline NMT models. To the best of our knowledge, we are the first to introduce theme-rheme theory into machine translation.
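The abstract does not spell out the caching mechanism, but the core idea lends itself to a short illustration. Below is a minimal Python sketch of a dynamic cache keyed by theme/rheme role: hidden states of previously translated sentences are stored per role and read back with scaled dot-product attention while decoding the current sentence. The class and method names (ThemeRhemeCache, update, read), the FIFO eviction rule, and the pooled read are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

class ThemeRhemeCache:
    """Toy dynamic cache keyed by theme/rheme role (illustrative only).

    Stores context vectors from previously translated sentences, tagged
    as 'theme' or 'rheme', and is queried with scaled dot-product
    attention when translating the current sentence.
    """

    def __init__(self, dim, capacity=32):
        self.dim = dim
        self.capacity = capacity              # max entries kept per role
        self.slots = {"theme": [], "rheme": []}

    def update(self, role, vector):
        """Dynamically cache a new context vector; evict the oldest entry
        when the role's slot is full (FIFO eviction is an assumption)."""
        slot = self.slots[role]
        if len(slot) >= self.capacity:
            slot.pop(0)
        slot.append(np.asarray(vector, dtype=np.float64))

    def read(self, query):
        """Attend over all cached vectors and return a single context
        vector for the decoder to consume."""
        keys = self.slots["theme"] + self.slots["rheme"]
        if not keys:
            return np.zeros(self.dim)
        K = np.stack(keys)                          # (n, dim)
        scores = K @ query / np.sqrt(self.dim)      # scaled dot product
        weights = np.exp(scores - scores.max())     # stable softmax
        weights /= weights.sum()
        return weights @ K                          # weighted sum

# Usage: cache theme/rheme states from a finished sentence, then query
# the cache while decoding the next one.
cache = ThemeRhemeCache(dim=4)
cache.update("theme", [0.1, 0.3, 0.0, 0.2])
cache.update("rheme", [0.5, 0.1, 0.4, 0.0])
context = cache.read(np.array([0.2, 0.2, 0.1, 0.1]))
print(context.shape)  # (4,)
```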

Citation (APA)

Tong, Y., Zheng, J., Zhu, H., Chen, Y., & Shi, X. (2020). A Document-Level Neural Machine Translation Model with Dynamic Caching Guided by Theme-Rheme Information. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 4385–4395). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.388
