NICT Self-Training Approach to Neural Machine Translation at NMT-2018


Abstract

This paper describes the NICT neural machine translation system submitted to the NMT-2018 shared task. A characteristic of our approach is the introduction of self-training. Because our self-training does not change the model structure, it does not affect translation efficiency, such as translation speed. Experimental results showed that translation quality improved not only for sequence-to-sequence (seq-to-seq) models but also for Transformer models.
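As a rough illustration of the general technique the abstract names, below is a minimal Python sketch of one common self-training loop for NMT: forward-translating monolingual source text into synthetic parallel data and retraining on the union. The interface (train, translate) and all names here are hypothetical, not from the paper, and the authors' exact procedure may differ.

from typing import Callable, List, Tuple

ParallelCorpus = List[Tuple[str, str]]  # (source, target) sentence pairs
Translator = Callable[[str], str]

def self_train(
    train: Callable[[ParallelCorpus], Translator],
    parallel: ParallelCorpus,
    monolingual_sources: List[str],
    rounds: int = 1,
) -> Translator:
    """Train a translator, then retrain the same architecture on its own outputs.

    Only the training data changes between rounds; the model structure is
    fixed, so decoding speed at inference time is unaffected.
    """
    translate = train(parallel)
    for _ in range(rounds):
        # Forward-translate monolingual source sentences into synthetic pairs.
        synthetic = [(src, translate(src)) for src in monolingual_sources]
        # Retrain the same architecture on genuine plus synthetic data.
        translate = train(parallel + synthetic)
    return translate

Because the loop only swaps in new training data rather than new model components, it applies equally to seq-to-seq and Transformer models, consistent with the abstract's claim that both benefit.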

Citation (APA)

Imamura, K., & Sumita, E. (2018). NICT Self-Training Approach to Neural Machine Translation at NMT-2018. In Proceedings of the 2nd Workshop on Neural Machine Translation and Generation (pp. 110–115). Association for Computational Linguistics. https://doi.org/10.18653/v1/w18-2713
