Self-Guided Curriculum Learning for Neural Machine Translation

Abstract

In supervised learning, a well-trained model should be able to recover the ground truth accurately, i.e. the predicted labels are expected to resemble the ground truth labels as closely as possible. Inspired by this, we formulate a difficulty criterion based on the recovery degrees of training examples. Motivated by the intuition that after skimming through the training corpus, the neural machine translation (NMT) model “knows” how to schedule a suitable curriculum according to learning difficulty, we propose a self-guided curriculum learning strategy that encourages the NMT model to learn from easy to hard on the basis of recovery degrees. Specifically, we adopt the sentence-level BLEU score as the proxy of recovery degree. Experimental results on translation benchmarks including WMT14 English→German and WMT17 Chinese→English demonstrate that our proposed method considerably improves the recovery degree, thus consistently improving translation performance.
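To make the difficulty criterion concrete, the sketch below illustrates one way the recovery-degree idea could be implemented: score each training pair by the sentence-level BLEU of the model's own output against the reference, then order examples from easy (high BLEU) to hard (low BLEU). This is a minimal illustration, not the paper's implementation; it assumes the sacrebleu library for sentence-level BLEU, and `model_translate` is a hypothetical callable standing in for a model that has already skimmed the corpus.

```python
# Minimal sketch of a recovery-degree-based curriculum (illustrative only).
# Assumptions: sacrebleu provides sentence-level BLEU; `model_translate` is a
# hypothetical function mapping a source sentence to the model's translation.
from typing import Callable, List, Tuple
import sacrebleu


def recovery_degree(hypothesis: str, reference: str) -> float:
    """Sentence-level BLEU of the model output against the reference,
    used as a proxy for how well the example has been recovered."""
    return sacrebleu.sentence_bleu(hypothesis, [reference]).score


def schedule_curriculum(
    corpus: List[Tuple[str, str]],                 # (source, reference) pairs
    model_translate: Callable[[str], str],          # hypothetical: source -> hypothesis
) -> List[Tuple[str, str]]:
    """Order training examples from easy to hard: a higher recovery degree
    (higher sentence BLEU) indicates an easier example for the current model."""
    scored = [
        (recovery_degree(model_translate(src), ref), src, ref)
        for src, ref in corpus
    ]
    scored.sort(key=lambda item: item[0], reverse=True)  # easiest first
    return [(src, ref) for _, src, ref in scored]
```

In practice the paper's curriculum schedule may group examples into difficulty buckets and anneal from easy to hard over training rather than sorting the whole corpus once; the sketch only shows how sentence-level BLEU can serve as the easy-to-hard ordering signal.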

Citation (APA)

Zhou, L., Ding, L., Duh, K., Watanabe, S., Sasano, R., & Takeda, K. (2021). Self-Guided Curriculum Learning for Neural Machine Translation. In IWSLT 2021 - 18th International Conference on Spoken Language Translation, Proceedings (pp. 206–214). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.iwslt-1.25
