This paper describes the NiuTrans neural machine translation systems for the WMT20 news translation tasks. We participated in five tasks in total — Japanese↔English, English→Chinese, Inuktitut→English, and Tamil→English — and ranked first in both directions of Japanese↔English. Our systems mainly employed iterative back-translation, model architectures of varying depth and width, iterative knowledge distillation, and iterative fine-tuning. We find that adequately widening and deepening the model simultaneously improves performance significantly. The iterative fine-tuning strategy we implemented is also effective for domain adaptation. For the Inuktitut→English and Tamil→English tasks, we built separate multilingual models and employed pretrained word embeddings to obtain better performance.
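As a rough illustration of the iterative back-translation mentioned above, the sketch below uses a toy word-for-word "model" in place of a real NMT system; the `train`, `translate`, and `iterative_back_translation` helpers are hypothetical stand-ins, not the paper's implementation. The core idea — alternately translating monolingual data with the reverse-direction model to produce synthetic parallel data, then retraining — is the same.

```python
# Toy sketch of iterative back-translation (hypothetical; real systems
# use full NMT models, not word-for-word dictionaries).

def train(pairs):
    """Build a toy word-for-word 'model' from parallel sentence pairs."""
    model = {}
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            model[s] = t
    return model

def translate(model, sentence):
    """Translate word by word, leaving unknown words unchanged."""
    return " ".join(model.get(w, w) for w in sentence.split())

def iterative_back_translation(parallel, mono_src, mono_tgt, rounds=2):
    """Grow the training data by back-translating monolingual text."""
    fwd = train(parallel)                        # source -> target model
    bwd = train([(t, s) for s, t in parallel])   # target -> source model
    for _ in range(rounds):
        # Back-translate target-side monolingual data into synthetic
        # source sentences, then retrain the forward model on the union.
        synthetic = [(translate(bwd, t), t) for t in mono_tgt]
        fwd = train(parallel + synthetic)
        # Symmetrically refresh the backward model with synthetic
        # target sentences produced by the improved forward model.
        synthetic_rev = [(translate(fwd, s), s) for s in mono_src]
        bwd = train([(t, s) for s, t in parallel] + synthetic_rev)
    return fwd, bwd
```

In practice each round trains both directions on real plus synthetic data, so the two models bootstrap each other as the synthetic translations improve.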
Zhang, Y., Wang, Z., Cao, R., Wei, B., Shan, W., Zhou, S., … Zhu, J. (2020). The NiuTrans Machine Translation Systems for WMT20. In 5th Conference on Machine Translation, WMT 2020 - Proceedings (pp. 338–345). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w19-5325