Neural machine translation by minimising the Bayes-risk with respect to syntactic translation lattices

39 Citations
100 Mendeley Readers

Abstract

We present a novel scheme to combine neural machine translation (NMT) with traditional statistical machine translation (SMT). Our approach borrows ideas from linearised lattice minimum Bayes-risk decoding for SMT. The NMT score is combined with the Bayes-risk of the translation according to the SMT lattice. This makes our approach much more flexible than n-best list or lattice rescoring, as the neural decoder is not restricted to the SMT search space. We show an efficient and simple way to integrate risk estimation into the NMT decoder which is suitable for word-level as well as subword-unit-level NMT. We test our method on English-German and Japanese-English and report significant gains over lattice rescoring on several data sets for both single and ensembled NMT. The MBR decoder produces entirely new hypotheses far beyond simply rescoring the SMT search space or fixing UNKs in the NMT output.
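To illustrate the idea described in the abstract, the following is a minimal sketch of combining an NMT model score with lattice-derived Bayes-risk evidence. All names, the toy n-gram posteriors, and the simple linear interpolation are illustrative assumptions; the paper's actual formulation operates on n-gram posteriors extracted from syntactic SMT translation lattices inside the NMT beam search.

```python
import math

# Toy n-gram posteriors standing in for those extracted from an SMT
# translation lattice: higher means the n-gram is better supported by
# the SMT search space. (Illustrative values, not from the paper.)
lattice_ngram_posterior = {
    ("das",): 0.9,
    ("haus",): 0.85,
    ("das", "haus"): 0.8,
    ("gebaeude",): 0.1,
}

def mbr_evidence(hyp, max_order=2):
    """Accumulate lattice n-gram posteriors over a hypothesis.

    Higher accumulated evidence corresponds to lower Bayes-risk
    with respect to the lattice.
    """
    score = 0.0
    for n in range(1, max_order + 1):
        for i in range(len(hyp) - n + 1):
            score += lattice_ngram_posterior.get(tuple(hyp[i:i + n]), 0.0)
    return score

def combined_score(nmt_logprob, hyp, lam=0.5):
    """Linearly combine the NMT log-probability with MBR evidence.

    Because the risk term only rewards lattice-supported n-grams
    (rather than constraining the search space), the decoder can
    still produce hypotheses outside the SMT lattice.
    """
    return nmt_logprob + lam * mbr_evidence(hyp)

# A hypothesis well supported by the lattice vs. a novel NMT hypothesis:
in_lattice = combined_score(math.log(0.3), ["das", "haus"])
novel = combined_score(math.log(0.4), ["das", "gebaeude"])
```

Here the lattice-supported hypothesis wins the combined score despite its lower NMT probability, which is the intended effect of the risk term; the interpolation weight `lam` would in practice be tuned on held-out data.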

Citation (APA)

Stahlberg, F., de Gispert, A., Hasler, E., & Byrne, B. (2017). Neural machine translation by minimising the Bayes-risk with respect to syntactic translation lattices. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 2, pp. 362–368). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-2058
