Unsupervised Neural Machine Translation with Universal Grammar


Abstract

Machine translation usually relies on parallel corpora to provide parallel signals for training. The advent of unsupervised machine translation has freed machine translation from this reliance, though performance still lags behind traditional supervised machine translation. In unsupervised machine translation, the model seeks symmetric similarities between languages as a source of weak parallel signal to achieve translation. Chomsky's Universal Grammar theory postulates that grammar is an innate form of knowledge in humans, governed by universal principles and constraints. In this paper, we therefore seek to leverage such shared grammar clues to provide more explicit parallel signals between languages and thereby enhance the training of unsupervised machine translation models. Through experiments on multiple typical language pairs, we demonstrate the effectiveness of our proposed approaches.

Citation (APA)

Li, Z., Utiyama, M., Sumita, E., & Zhao, H. (2021). Unsupervised Neural Machine Translation with Universal Grammar. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 3249–3264). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.261
