State-of-the-art multilingual machine translation relies on a universal encoder-decoder shared across all languages. In this paper, we propose an alternative approach based on language-specific encoder-decoders, which can be easily extended to new languages by learning only their corresponding modules. To establish a common interlingua representation, we jointly train the system on an initial set of N languages. Our experiments show that the proposed approach outperforms the shared encoder-decoder both for the initial languages and when adding new ones, without the need to retrain the existing modules. All in all, our work closes the gap between shared and language-specific encoder-decoders, advancing toward modular multilingual machine translation systems that can be flexibly extended in lifelong learning settings.
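The modular design described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-based illustration, not the authors' released implementation: the class name LangSpecificMT, the add_language helper, and all hyperparameters are hypothetical. The idea it shows is the one stated above: each language owns its own encoder and decoder, any translation direction routes the source encoder's output into the target decoder, and a new language is added by training only its own modules while the existing ones remain untouched.

```python
# Minimal sketch of language-specific encoder-decoders (hypothetical names and
# hyperparameters; positional encodings and masking omitted for brevity).
import torch
import torch.nn as nn


class LangSpecificMT(nn.Module):
    def __init__(self, languages, vocab_sizes, d_model=512, nhead=8, layers=6):
        super().__init__()
        self.d_model, self.nhead, self.layers = d_model, nhead, layers
        # One embedding, encoder, decoder, and output projection per language.
        self.embed = nn.ModuleDict(
            {l: nn.Embedding(vocab_sizes[l], d_model) for l in languages})
        self.encoders = nn.ModuleDict(
            {l: self._new_encoder() for l in languages})
        self.decoders = nn.ModuleDict(
            {l: self._new_decoder() for l in languages})
        self.out_proj = nn.ModuleDict(
            {l: nn.Linear(d_model, vocab_sizes[l]) for l in languages})

    def _new_encoder(self):
        layer = nn.TransformerEncoderLayer(self.d_model, self.nhead, batch_first=True)
        return nn.TransformerEncoder(layer, num_layers=self.layers)

    def _new_decoder(self):
        layer = nn.TransformerDecoderLayer(self.d_model, self.nhead, batch_first=True)
        return nn.TransformerDecoder(layer, num_layers=self.layers)

    def forward(self, src_tokens, tgt_tokens, src_lang, tgt_lang):
        # Encode with the source language's own encoder into the shared space...
        memory = self.encoders[src_lang](self.embed[src_lang](src_tokens))
        # ...and decode with the target language's own decoder.
        hidden = self.decoders[tgt_lang](self.embed[tgt_lang](tgt_tokens), memory)
        return self.out_proj[tgt_lang](hidden)

    def add_language(self, lang, vocab_size):
        # Extending to a new language: create fresh modules for it; during
        # training, only these parameters are updated while the modules of the
        # initial languages stay frozen.
        self.embed[lang] = nn.Embedding(vocab_size, self.d_model)
        self.encoders[lang] = self._new_encoder()
        self.decoders[lang] = self._new_decoder()
        self.out_proj[lang] = nn.Linear(self.d_model, vocab_size)


# Usage sketch (hypothetical vocabulary sizes): jointly train the initial
# languages, then extend with a new one and optimize only its modules.
model = LangSpecificMT(["es", "fr"], {"es": 32000, "fr": 32000})
logits = model(torch.randint(0, 32000, (2, 10)),
               torch.randint(0, 32000, (2, 9)), "es", "fr")
model.add_language("de", 32000)
```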
Citation
Escolano, C., Costa-Jussà, M. R., & Fonollosa, J. A. R. (2022). Multilingual Machine Translation: Deep Analysis of Language-Specific Encoder-Decoders. Journal of Artificial Intelligence Research, 73, 1535–1552. https://doi.org/10.1613/jair.1.12699