Abstract
Simultaneous machine translation (SIMT) involves translating source utterances into the target language in real time, before the speaker finishes the utterance. This paper proposes a multilingual approach to SIMT, where a single model simultaneously translates between multiple language pairs. This not only requires fewer models and parameters (hence simpler deployment), but may also yield higher-performing models by capturing commonalities among the languages. We further explore simple and effective multilingual architectures based on two strong, recently proposed SIMT models. Our results on translating from two Germanic languages (German, Dutch) and three Romance languages (French, Italian, Romanian) into English show that (i) the single multilingual model is on par with or better than the individual models, and (ii) multilingual SIMT models trained per language family are on par with or better than the universal model trained on all languages.
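To illustrate what "translating before the speaker finishes the utterance" means operationally, here is a minimal sketch of a wait-k style read/write schedule, a common way to formalize simultaneous decoding. This is purely illustrative and is not the paper's model: the scheduling function and the fixed lag `k` are assumptions for the example, and the WRITE steps stand in for whatever target tokens a trained SIMT model would emit.

```python
def wait_k_schedule(source_tokens, k):
    """Interleave READ/WRITE actions for a wait-k policy (illustrative):
    first READ k source tokens, then alternate one WRITE per READ,
    flushing any remaining WRITEs once the source is exhausted.
    Each WRITE is indexed by target position; a real SIMT model would
    emit an actual target token here."""
    actions = []
    read, written = 0, 0
    n = len(source_tokens)
    while written < n:  # simplifying assumption: one target token per source token
        if read < min(written + k, n):
            read += 1
            actions.append(("READ", source_tokens[read - 1]))
        else:
            written += 1
            actions.append(("WRITE", written))
    return actions

# Example: with k=2 the model starts emitting output after only two
# source tokens have arrived, i.e. before the utterance is complete.
schedule = wait_k_schedule(["ich", "sehe", "den", "Hund"], k=2)
```

With `k=2` the schedule is READ, READ, then alternating WRITE/READ until the source ends, so translation output begins while most of the source utterance is still unheard; the multilingual question the paper studies is whether one model can drive such a schedule for several source languages at once.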
Citation
Arthur, P., Ryu, D. K., & Haffari, G. (2021). Multilingual Simultaneous Neural Machine Translation. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 4758–4766). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.420