Improved neural machine translation with source syntax


Abstract

Neural Machine Translation (NMT) based on the encoder-decoder architecture has recently achieved state-of-the-art performance. Prior work has shown that extending word-level attention to phrase-level attention by incorporating source-side phrase structure can enhance the attention model and yield promising improvements. However, the word dependencies that are crucial to correctly understanding a source sentence are not always consecutive (i.e., contained within a phrase structure); they can also span long distances, and phrase structures are not well suited to modeling such long-distance dependencies explicitly. In this paper we propose a simple but effective method for incorporating source-side long-distance dependencies into NMT. Our method, based on dependency trees, enriches each source state with global dependency structure, which better captures the inherent syntactic structure of source sentences. Experiments on Chinese-English and English-Japanese translation tasks show that our proposed method outperforms state-of-the-art SMT and NMT baselines.
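The abstract describes the method only at a high level. As one purely illustrative reading of "enriching each source state with global dependency structures", each source token's encoder input can be augmented with features derived from its dependency head before running a standard bidirectional encoder. The minimal PyTorch sketch below follows that reading; the class name DependencyEnrichedEncoder, the GRU encoder, the dimensions, and the choice of head-word and relation-label features are all assumptions for illustration, not the authors' exact architecture.

```python
# Illustrative sketch only: augment each source token with the embedding of
# its dependency head and of the dependency relation, then encode with a
# bidirectional GRU. Not the paper's exact model.
import torch
import torch.nn as nn

class DependencyEnrichedEncoder(nn.Module):
    def __init__(self, vocab_size, rel_vocab_size,
                 emb_dim=256, rel_dim=32, hid_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.rel_emb = nn.Embedding(rel_vocab_size, rel_dim)
        # Input per token: own embedding + head-word embedding + relation embedding.
        self.rnn = nn.GRU(2 * emb_dim + rel_dim, hid_dim,
                          batch_first=True, bidirectional=True)

    def forward(self, tokens, head_idx, rel_ids):
        # tokens:   (batch, seq) word ids
        # head_idx: (batch, seq) index of each token's dependency head
        #           (the root token may simply point to itself)
        # rel_ids:  (batch, seq) dependency-relation label ids
        emb = self.word_emb(tokens)                              # (B, S, E)
        # Gather each token's head embedding along the sequence axis.
        heads = torch.gather(emb, 1,
                             head_idx.unsqueeze(-1).expand_as(emb))
        rels = self.rel_emb(rel_ids)                             # (B, S, R)
        enriched = torch.cat([emb, heads, rels], dim=-1)         # dependency-enriched input
        states, _ = self.rnn(enriched)                           # (B, S, 2*H) source states
        return states

if __name__ == "__main__":
    # Toy usage with random ids to show the expected shapes.
    enc = DependencyEnrichedEncoder(vocab_size=1000, rel_vocab_size=50)
    toks = torch.randint(0, 1000, (2, 7))
    heads = torch.randint(0, 7, (2, 7))
    rels = torch.randint(0, 50, (2, 7))
    print(enc(toks, heads, rels).shape)  # torch.Size([2, 7, 512])
```

Feeding head-word and relation features directly into the encoder input is one simple way to expose long-distance dependency links to every source state; the paper's actual mechanism may differ in how the dependency tree is encoded.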

Citation (APA)

Wu, S., Zhou, M., & Zhang, D. (2017). Improved neural machine translation with source syntax. In IJCAI International Joint Conference on Artificial Intelligence (pp. 4179–4185). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/584
