Quick Back-Translation for Unsupervised Machine Translation

Abstract

The field of unsupervised machine translation has seen significant advancement from the marriage of the Transformer and the back-translation algorithm. The Transformer is a powerful generative model, and back-translation leverages the Transformer's high-quality translations for iterative self-improvement. However, the Transformer is encumbered by the run-time of autoregressive inference during back-translation, and back-translation is limited by a lack of synthetic-data efficiency. We propose a two-for-one improvement to Transformer back-translation: Quick Back-Translation (QBT). QBT re-purposes the encoder as a generative model and uses encoder-generated sequences to train the decoder in conjunction with the original autoregressive back-translation step, improving data throughput and utilization. Experiments on various WMT benchmarks demonstrate that a relatively small number of QBT refinement steps improves current unsupervised machine translation models, and that QBT dramatically outperforms the standard back-translation-only method in training efficiency at comparable translation quality.
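
The sketch below is a minimal, illustrative reading of the abstract, assuming a standard PyTorch encoder-decoder Transformer (torch.nn.Transformer as a stand-in) with toy dimensions. The helper names (encoder_generate, decoder_generate, train_step) and the specific choice of projecting encoder hidden states straight to the vocabulary are assumptions made for illustration, not the authors' implementation; the intent is only to contrast the cheap, parallel encoder-generation step with the slow autoregressive back-translation step.

# Illustrative sketch of Quick Back-Translation (assumed details, not the paper's code).
import torch
import torch.nn as nn

V, D = 1000, 64          # toy vocabulary and model sizes
emb = nn.Embedding(V, D)
model = nn.Transformer(d_model=D, nhead=4, num_encoder_layers=2,
                       num_decoder_layers=2, batch_first=True)
proj = nn.Linear(D, V)   # output projection over the vocabulary
opt = torch.optim.Adam(list(emb.parameters()) + list(model.parameters())
                       + list(proj.parameters()), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def encoder_generate(src_ids):
    # QBT step (assumed form): re-purpose the encoder as a generator by
    # projecting its hidden states to the vocabulary in one parallel pass.
    with torch.no_grad():
        h = model.encoder(emb(src_ids))
        return proj(h).argmax(-1)          # one token per source position

def decoder_generate(src_ids, max_len=16, bos=1):
    # Standard back-translation step: slow, token-by-token autoregressive decoding.
    with torch.no_grad():
        mem = model.encoder(emb(src_ids))
        out = torch.full((src_ids.size(0), 1), bos, dtype=torch.long)
        for _ in range(max_len - 1):
            logits = proj(model.decoder(emb(out), mem))
            out = torch.cat([out, logits[:, -1:].argmax(-1)], dim=-1)
        return out

def train_step(synthetic_src, target_ids):
    # Train the decoder to reconstruct the monolingual target from a
    # synthetic source, whichever generator produced that source.
    mem = model.encoder(emb(synthetic_src))
    tgt_in = target_ids[:, :-1]
    mask = nn.Transformer.generate_square_subsequent_mask(tgt_in.size(1))
    logits = proj(model.decoder(emb(tgt_in), mem, tgt_mask=mask))
    loss = loss_fn(logits.reshape(-1, V), target_ids[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

mono = torch.randint(2, V, (8, 16))           # toy monolingual batch
train_step(encoder_generate(mono), mono)      # cheap QBT update
train_step(decoder_generate(mono), mono)      # standard back-translation update

In this reading, the encoder-generated synthetic sources raise data throughput because they cost a single forward pass per batch, while the autoregressive step is retained so the decoder still sees the higher-quality translations that drive standard back-translation.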

Cite

Brimacombe, B., & Zhou, J. (2023). Quick Back-Translation for Unsupervised Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 8521–8534). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-emnlp.571
