Zero-resource translation with multi-lingual neural machine translation

Abstract

In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.
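The pivot-based baseline mentioned above can be illustrated with a toy sketch: translation from source to target is chained through an intermediate (pivot) language using two supervised models, whereas the paper's zero-resource approach translates directly with a single multilingual model. The `translate_*` functions below are hypothetical word-lookup stand-ins, not the paper's models; they only show the structure of pivoting.

```python
# Toy Spanish->English model (a word lookup stands in for an NMT model).
def translate_es_en(sentence):
    table = {"hola": "hello", "mundo": "world"}
    return " ".join(table.get(w, w) for w in sentence.split())

# Toy English->French model.
def translate_en_fr(sentence):
    table = {"hello": "bonjour", "world": "monde"}
    return " ".join(table.get(w, w) for w in sentence.split())

def pivot_translate(sentence):
    # Spanish -> English (pivot) -> French: two decoding passes,
    # so errors in the pivot step compound and latency doubles --
    # the weakness the paper's direct zero-resource strategy avoids.
    return translate_en_fr(translate_es_en(sentence))

print(pivot_translate("hola mundo"))  # -> "bonjour monde"
```

Because pivoting requires a full decode into the pivot language, any ambiguity lost there cannot be recovered by the second model; the paper's finetuned multilingual model sidesteps this by sharing parameters across language pairs.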

Citation (APA)

Firat, O., Sankaran, B., Al-Onaizan, Y., Yarman Vural, F. T., & Cho, K. (2016). Zero-resource translation with multi-lingual neural machine translation. In EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 268–277). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d16-1026
