Document-level neural machine translation with associated memory network

Abstract

Standard neural machine translation (NMT) is built on the assumption that sentences can be translated independently of their document-level context. Most existing document-level NMT approaches settle for a coarse sense of global document-level information, whereas this work exploits detailed document-level context through a memory network. The memory network's ability to detect the parts of memory most relevant to the current sentence offers a natural way to model rich document-level context. In this work, the proposed document-aware memory network is implemented on top of a Transformer NMT baseline. Experiments on several tasks show that the proposed method significantly improves translation performance over strong Transformer baselines and other related studies.
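The core operation the abstract describes is retrieving the memory slots most relevant to the current sentence. A minimal sketch of such a lookup, using plain dot-product attention over a document memory (this is an illustrative simplification, not the paper's exact architecture; the function name and dimensions are assumptions):

```python
import numpy as np

def memory_attention(query, memory):
    """Attend over a document memory: score each memory slot against
    the query, normalize the scores with a softmax, and return the
    attention-weighted sum as the document-level context vector."""
    scores = memory @ query                 # (num_slots,) similarity scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over memory slots
    context = weights @ memory              # (dim,) weighted combination
    return context, weights

# Toy usage: 4 memory slots (e.g. encoded document sentences), dim 3.
rng = np.random.default_rng(0)
memory = rng.normal(size=(4, 3))
query = rng.normal(size=3)                  # encoding of the current sentence
context, weights = memory_attention(query, memory)
```

In a document-aware Transformer, a vector like `context` would typically be fused with the sentence-level encoder or decoder states, so translation of each sentence can draw on the rest of the document.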

Citation (APA)

Jiang, S., Wang, R., Li, Z., Utiyama, M., Chen, K., Sumita, E., … Lu, B. L. (2021). Document-level neural machine translation with associated memory network. IEICE Transactions on Information and Systems, E104D(10), 1712–1723. https://doi.org/10.1587/transinf.2020EDP7244
