Although multilingual neural machine translation (MNMT) enables translation across multiple language pairs, the standard training process optimizes each direction with an independent objective. As a result, most multilingual models cannot explicitly exploit different language pairs to assist one another, ignoring the relationships among them. In this work, we propose a novel agreement-based method that encourages multilingual agreement among different translation directions by minimizing the differences among them. We combine the multilingual training objectives with an agreement term by randomly substituting fragments of the source sentence with their counterpart translations in auxiliary languages. To examine the effectiveness of our method, we conduct experiments on a multilingual translation task covering 10 language pairs. Experimental results show that our method achieves significant improvements over previous multilingual baselines.
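The substitution step described above can be sketched as a simple code-switching routine. This is a minimal illustrative sketch, not the authors' implementation: the `code_switch` function, the token-level `lexicon`, and the `ratio` parameter are all assumptions introduced here for illustration; the paper substitutes fragments using counterpart translations from auxiliary languages.

```python
import random

def code_switch(tokens, lexicon, ratio=0.3, seed=None):
    """Randomly replace source tokens with auxiliary-language
    counterparts drawn from a bilingual lexicon (hypothetical
    interface; the paper operates on translated fragments)."""
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        # Replace the token with probability `ratio` if a
        # counterpart translation is available.
        if tok in lexicon and rng.random() < ratio:
            out.append(rng.choice(lexicon[tok]))
        else:
            out.append(tok)
    return out

# Toy English->German lexicon, purely illustrative.
lexicon = {"cat": ["Katze"], "eats": ["isst"]}
print(code_switch(["the", "cat", "eats"], lexicon, ratio=1.0, seed=0))
```

The code-switched sentence would then be fed to the model alongside the original, with the agreement term penalizing differences between the two resulting translation distributions.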
Citation:
Yang, J., Yin, Y., Ma, S., Huang, H., Zhang, D., Li, Z., & Wei, F. (2021). Multilingual Agreement for Multilingual Neural Machine Translation. In ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 233–239). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.acl-short.31