Abstract
We present effective pre-training strategies for neural machine translation (NMT) using parallel corpora involving a pivot language, i.e., source-pivot and pivot-target, leading to a significant improvement in source→target translation. We propose three methods to strengthen the relation among source, pivot, and target languages in the pre-training: 1) step-wise training of a single model for different language pairs, 2) an additional adapter component to smoothly connect the pre-trained encoder and decoder, and 3) cross-lingual encoder training via autoencoding of the pivot language. Our methods outperform multilingual models by up to +2.6% BLEU in the WMT 2019 French→German and German→Czech tasks. We show that our improvements also hold in zero-shot/zero-resource scenarios.
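The abstract only names the three techniques; as a rough illustration of the second one, the sketch below shows a small trainable adapter inserted between a pre-trained encoder and a pre-trained decoder to bridge their representation spaces. This is a minimal PyTorch sketch: the module name, the residual-plus-LayerNorm structure, and all dimensions are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn


class EncoderDecoderAdapter(nn.Module):
    """Hypothetical adapter bridging a pre-trained encoder and decoder.

    A source-pivot encoder and a pivot-target decoder are trained on
    different data, so their representations need not align; a small
    trainable layer between them can smooth the connection. The exact
    structure here is an assumption made for illustration.
    """

    def __init__(self, d_model: int = 512):
        super().__init__()
        # Linear projection with a residual connection: nudge encoder
        # outputs toward the space the decoder's cross-attention expects.
        self.proj = nn.Linear(d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, enc_out: torch.Tensor) -> torch.Tensor:
        # enc_out: (batch, src_len, d_model) from the pre-trained encoder.
        return self.norm(enc_out + self.proj(enc_out))


# Usage sketch: transform encoder states before passing them to the decoder.
if __name__ == "__main__":
    adapter = EncoderDecoderAdapter(d_model=512)
    enc_out = torch.randn(8, 20, 512)  # placeholder encoder states
    bridged = adapter(enc_out)         # states fed to the decoder
    print(bridged.shape)               # torch.Size([8, 20, 512])
```

In such a setup, only the adapter (and optionally the decoder) would typically be updated during source→target fine-tuning, keeping the pre-trained components largely intact.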
Citation
Kim, Y., Petrov, P., Petrushkov, P., Khadivi, S., & Ney, H. (2019). Pivot-based transfer learning for neural machine translation between non-English languages. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 866–876). Association for Computational Linguistics. https://doi.org/10.18653/v1/D19-1080