Recurrent Neural Network based Rule Sequence Model for Statistical Machine Translation


Abstract

The inability to model long-distance dependency has handicapped statistical machine translation (SMT) for years. Specifically, the context independence assumption makes it hard to capture the dependency between translation rules. In this paper, we introduce a novel recurrent neural network based rule sequence model that incorporates arbitrarily long contextual information when estimating the probabilities of rule sequences. Moreover, our model frees the translation model from keeping huge and redundant grammars, resulting in more efficient training and decoding. Experimental results show that our method achieves a 0.9-point BLEU gain over the baseline, and a significant reduction in rule table size for both phrase-based and hierarchical phrase-based systems.
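To make the core idea concrete, the sketch below scores a derivation as a sequence of translation-rule IDs with a plain recurrent network: at each step the hidden state summarizes all previously applied rules, so the probability of the next rule conditions on the full history rather than a fixed n-gram window. This is an illustrative sketch only, not the authors' actual architecture; the rule vocabulary size, dimensions, and random parameters are all hypothetical stand-ins for a trained model.

```python
import numpy as np

# Hypothetical sizes: 1000 distinct translation rules, 32-dim embeddings/hidden state.
NUM_RULES, EMB, HID = 1000, 32, 32
rng = np.random.default_rng(0)

# Randomly initialized parameters stand in for trained ones.
E = rng.normal(0, 0.1, (NUM_RULES, EMB))    # rule embeddings
W_x = rng.normal(0, 0.1, (HID, EMB))        # input-to-hidden
W_h = rng.normal(0, 0.1, (HID, HID))        # hidden-to-hidden (carries the history)
W_o = rng.normal(0, 0.1, (NUM_RULES, HID))  # hidden-to-output

def log_prob(rule_seq):
    """log P(r_1..r_n) = sum_t log P(r_t | r_1..r_{t-1}), where the
    recurrent state h encodes the entire preceding rule sequence."""
    h = np.zeros(HID)
    total = 0.0
    for r in rule_seq:
        logits = W_o @ h
        logits -= logits.max()  # numerical stability before softmax
        total += (logits - np.log(np.exp(logits).sum()))[r]
        h = np.tanh(W_x @ E[r] + W_h @ h)  # fold rule r into the context
    return total

print(log_prob([3, 17, 42]))  # score a three-rule derivation
```

Because the score depends only on rule IDs and the recurrent state, such a model needs no stored n-gram context alongside each rule, which is consistent with the abstract's claim that the approach avoids keeping huge and redundant grammars.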

Citation (APA)

Yu, H., & Zhu, X. (2015). Recurrent neural network based rule sequence model for statistical machine translation. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Vol. 2, pp. 132–138). Association for Computational Linguistics. https://doi.org/10.3115/v1/p15-2022
