Scheduled Multi-Task Learning: From Syntax to Translation

  • Eliyahu Kiperwasser
  • Miguel Ballesteros
Citations: N/A
Readers (Mendeley): 129

Abstract

Neural encoder-decoder models of machine translation have achieved impressive results, learning linguistic knowledge of both the source and target languages in an implicit, end-to-end manner. We propose a framework in which the model begins by learning syntax and translation in an interleaved fashion, gradually shifting its focus toward translation. With this approach, we achieve considerable improvements in BLEU score both on a relatively large parallel corpus (WMT14 English-to-German) and in a low-resource setup (WIT German-to-English).
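The scheduling idea in the abstract can be illustrated with a minimal task-sampling sketch: at each training step, the trainer picks either the auxiliary syntax task or the translation task, with the probability of choosing translation growing over the course of training. This is a hypothetical sketch of one possible schedule (a linear ramp), not the paper's exact mechanism; the function and parameter names are illustrative.

```python
import random

def pick_task(step, total_steps, rng=None):
    """Sample a training task under a linear schedule.

    Early steps favor the auxiliary syntax task; as `step` approaches
    `total_steps`, translation is chosen with probability approaching 1.
    A hypothetical sketch; the paper's actual schedules may differ.
    """
    rng = rng or random.Random()
    p_translation = min(1.0, step / total_steps)  # ramps from 0 to 1
    return "translation" if rng.random() < p_translation else "syntax"
```

At `step == 0` the sampler always returns the syntax task, and by `step == total_steps` it always returns translation, so the two objectives are interleaved in between with a gradually shifting mix.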

Citation (APA)

Kiperwasser, E., & Ballesteros, M. (2018). Scheduled Multi-Task Learning: From Syntax to Translation. Transactions of the Association for Computational Linguistics, 6, 225–240. https://doi.org/10.1162/tacl_a_00017
