Multilingual machine translation: Closing the gap between shared and language-specific encoder-decoders


Abstract

State-of-the-art multilingual machine translation relies on a universal encoder-decoder, which requires retraining the entire system to add new languages. In this paper, we propose an alternative approach based on language-specific encoder-decoders, which can be more easily extended to new languages by learning only their corresponding modules. To encourage a common interlingua representation, we simultaneously train the N initial languages. Our experiments show that the proposed approach outperforms the universal encoder-decoder by 3.28 BLEU points on average, while allowing new languages to be added without retraining the rest of the modules. All in all, our work closes the gap between shared and language-specific encoder-decoders, advancing toward modular multilingual machine translation systems that can be flexibly extended in lifelong learning settings.
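The modular scheme the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the class and method names are hypothetical, and the `LangModule` objects merely stand in for trained encoder/decoder networks. The sketch shows the two key properties claimed in the abstract: any encoder can be paired with any decoder through the shared intermediate representation, and a new language is added by training only its own modules while the existing ones stay frozen.

```python
# Hypothetical sketch of a modular multilingual MT system with
# language-specific encoder-decoders. Names are illustrative only.

class LangModule:
    """Stand-in for a language-specific encoder or decoder network."""
    def __init__(self, lang):
        self.lang = lang
        self.frozen = False  # trainable until the module is frozen


class ModularMT:
    def __init__(self, initial_langs):
        # The N initial languages are trained jointly so that their
        # encoders map into a common interlingua representation.
        self.encoders = {l: LangModule(l) for l in initial_langs}
        self.decoders = {l: LangModule(l) for l in initial_langs}

    def add_language(self, lang):
        # Freeze all existing modules: no retraining of the rest of
        # the system is needed when a new language is added.
        for mods in (self.encoders, self.decoders):
            for m in mods.values():
                m.frozen = True
        # Only the new language's encoder and decoder are trainable.
        self.encoders[lang] = LangModule(lang)
        self.decoders[lang] = LangModule(lang)

    def translation_directions(self):
        # Any encoder can be combined with any decoder, giving
        # N * (N - 1) translation directions for N languages.
        return [(s, t) for s in self.encoders for t in self.decoders
                if s != t]


system = ModularMT(["en", "de", "fr"])
system.add_language("es")
print(len(system.translation_directions()))  # 4 languages -> 12 directions
```

Freezing the shared side is what makes the approach suitable for lifelong learning: the interlingua established during the joint training of the initial languages is left untouched, and each new module only has to learn to map into and out of it.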

Citation (APA)

Escolano, C., Costa-Jussà, M. R., Fonollosa, J. A. R., & Artetxe, M. (2021). Multilingual machine translation: Closing the gap between shared and language-specific encoder-decoders. In EACL 2021 - 16th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference (pp. 944–948). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.eacl-main.80
