This paper describes the NICT neural machine translation system submitted to the NMT-2018 shared task. A characteristic of our approach is the introduction of self-training. Because our self-training does not change the model structure, it does not affect translation efficiency, such as decoding speed. The experimental results showed that translation quality improved not only for sequence-to-sequence (seq-to-seq) models but also for Transformer models.
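The abstract does not detail the procedure, but generic self-training for NMT typically has the trained model forward-translate monolingual source sentences into pseudo-targets, which are then mixed with the original bitext for another training round. The sketch below illustrates that generic loop under assumed, hypothetical interfaces (the `train` and `translate` callables are placeholders); it is not the authors' exact method.

```python
# Minimal sketch of generic self-training for NMT.
# Hypothetical interfaces, not the paper's actual implementation:
# `train` takes parallel data and returns a sentence-level translate
# function. Only the training data changes across rounds; the model
# architecture is untouched, so decoding speed is unchanged.

from typing import Callable, List, Tuple

Bitext = List[Tuple[str, str]]  # (source, target) sentence pairs


def self_train(
    train: Callable[[Bitext], Callable[[str], str]],
    bitext: Bitext,
    mono_src: List[str],
    rounds: int = 1,
) -> Callable[[str], str]:
    """Augment the bitext with model-generated pseudo-targets and retrain."""
    translate = train(bitext)  # train on the real parallel data first
    for _ in range(rounds):
        # Forward-translate monolingual source sentences into pseudo-targets.
        pseudo = [(src, translate(src)) for src in mono_src]
        # Retrain from the combined real + synthetic parallel data.
        translate = train(bitext + pseudo)
    return translate
```

Because the augmentation happens entirely on the data side, the same loop applies unchanged to both seq-to-seq and Transformer models, consistent with the results reported in the abstract.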