Neural machine translation (NMT) with translation memory (TM) has emerged as a promising technique for improving machine translation quality. In this study, we propose an end-to-end NMT model with TM that exploits diversity in the retrieval-augmented phase through maximal marginal relevance (MMR). In particular, the proposed model is designed with a monolingual TM, enabling it to support low-resource scenarios. Furthermore, the memory retriever and the translation model are jointly trained to improve translation performance. For the experiments, we use the IWSLT15 (En ⟷ Vi) benchmark dataset to evaluate the proposed method. The experimental results show the effectiveness of the proposed method compared with strong baselines in this research field.
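As background for the diversity-enabled retrieval step, MMR greedily selects candidates that are relevant to the query while penalizing redundancy with already-selected items. The following is a minimal sketch of generic MMR selection; the function names, the cosine similarity measure, and the parameter values are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of maximal marginal relevance (MMR) selection.
# All names and the similarity function are illustrative assumptions.

def cosine(a, b):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def mmr_select(query, candidates, k=2, lam=0.7):
    """Greedily pick k candidate indices, trading off relevance to the
    query (weight lam) against redundancy with already-selected items."""
    selected = []
    remaining = list(range(len(candidates)))
    while remaining and len(selected) < k:
        def score(i):
            rel = cosine(query, candidates[i])
            red = max((cosine(candidates[i], candidates[j]) for j in selected),
                      default=0.0)
            return lam * rel - (1 - lam) * red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With a small `lam`, the diversity penalty dominates, so a near-duplicate of an already-selected candidate is skipped in favor of a more dissimilar one; with `lam` near 1, selection reduces to plain top-k relevance.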
CITATION STYLE
Nguyen, Q. C., Doan, X. D., Nguyen, V. V., & Bui, K. H. N. (2023). Neural Machine Translation with Diversity-Enabled Translation Memory. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13995 LNAI, pp. 322–333). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-99-5834-4_26