A retrieve-and-rewrite initialization method for unsupervised machine translation


Abstract

The commonly used framework for unsupervised machine translation builds initial translation models for both translation directions and then performs iterative back-translation to jointly boost their performance. The initialization stage is critical: a poor initialization may wrongly narrow the search space, and noise introduced at this stage may hurt the final performance. In this paper, we propose a novel retrieval-and-rewriting-based method to better initialize unsupervised translation models. We first retrieve semantically comparable sentences from the monolingual corpora of the two languages and then rewrite the target side with a dedicated rewriting model to minimize the semantic gap between the source and the retrieved targets. The rewritten sentence pairs are used to initialize SMT models, which in turn generate pseudo data for two NMT models, followed by iterative back-translation. Experiments show that our method builds better initial unsupervised translation models and improves the final translation performance by over 4 BLEU points.
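The retrieval step described above can be sketched in code. The following is a minimal, hypothetical illustration — it assumes semantically comparable sentence pairs are found via cosine similarity between cross-lingual sentence embeddings; the embedding source, the threshold, and the function names are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of the retrieval step: for each source sentence, find
# the most semantically comparable target-language sentence by cosine
# similarity of (assumed) cross-lingual sentence embeddings.
import numpy as np

def retrieve_comparable(src_embs: np.ndarray,
                        tgt_embs: np.ndarray,
                        threshold: float = 0.5):
    """Return (src_idx, tgt_idx, score) triples whose score >= threshold."""
    # Normalize rows so the dot product equals cosine similarity.
    src = src_embs / np.linalg.norm(src_embs, axis=1, keepdims=True)
    tgt = tgt_embs / np.linalg.norm(tgt_embs, axis=1, keepdims=True)
    sims = src @ tgt.T                       # (n_src, n_tgt) similarity matrix
    best = sims.argmax(axis=1)               # best target index per source
    return [(i, int(j), float(sims[i, j]))
            for i, j in enumerate(best) if sims[i, j] >= threshold]

# Toy example with random stand-in "embeddings" (threshold lowered to
# accept every pair, purely for demonstration):
rng = np.random.default_rng(0)
pairs = retrieve_comparable(rng.normal(size=(3, 8)),
                            rng.normal(size=(5, 8)),
                            threshold=-1.0)
```

The retained pairs would then feed the rewriting model, which edits the target side before the pairs are used to train the initial SMT models.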

Citation (APA)

Ren, S., Wu, Y., Liu, S., Zhou, M., & Ma, S. (2020). A retrieve-and-rewrite initialization method for unsupervised machine translation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 3498–3504). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.320
