Efficient Minimum Error Rate Training and Minimum Bayes-Risk Decoding for Translation Hypergraphs and Lattices


Abstract

Minimum Error Rate Training (MERT) and Minimum Bayes-Risk (MBR) decoding are used in most current state-of-the-art Statistical Machine Translation (SMT) systems. Both algorithms were originally developed to work with N-best lists of translations, and were recently extended to lattices that encode many more hypotheses than typical N-best lists. Here we extend lattice-based MERT and MBR algorithms to work with hypergraphs that encode a vast number of translations produced by MT systems based on Synchronous Context-Free Grammars. These algorithms are more efficient than the lattice-based versions presented earlier. We also show how MERT can be employed to optimize the parameters of MBR decoding. Our experiments show speedups for both MERT and MBR, as well as performance improvements from MBR decoding, on several language pairs.
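
To make the MBR decision rule in the abstract concrete: instead of returning the single highest-scoring translation, MBR decoding picks the hypothesis e' that maximizes the expected gain sum over e of G(e, e') * P(e | f), where P(e | f) is the model's posterior over translations. The sketch below illustrates this for the simple N-best-list case (not the paper's hypergraph or lattice algorithms); the function names and the overlap_gain stand-in for sentence-level BLEU are illustrative assumptions, not the authors' implementation.

    import math

    def overlap_gain(hyp, ref):
        # Hypothetical stand-in for sentence-level BLEU: unigram overlap ratio.
        hyp_toks = set(hyp.split())
        ref_toks = set(ref.split())
        return len(hyp_toks & ref_toks) / max(len(hyp_toks), 1)

    def mbr_decode(nbest, log_scores):
        # Normalize model scores into a posterior over the N-best list (softmax).
        m = max(log_scores)
        weights = [math.exp(s - m) for s in log_scores]
        z = sum(weights)
        posterior = [w / z for w in weights]
        # Choose the candidate with maximum expected gain, i.e. minimum Bayes risk.
        best, best_gain = None, float("-inf")
        for cand in nbest:
            gain = sum(p * overlap_gain(hyp, cand)
                       for hyp, p in zip(nbest, posterior))
            if gain > best_gain:
                best, best_gain = cand, gain
        return best

    # Example usage: candidates weighted by (log-domain) model scores.
    nbest = ["the cat sat", "a cat sat", "the dog ran"]
    print(mbr_decode(nbest, [-1.0, -1.1, -1.2]))

This brute-force version costs O(N^2) gain evaluations over the N-best list; the point of the paper is to compute the same kind of expectation efficiently over lattices and hypergraphs, which encode exponentially many hypotheses.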

Citation (APA)

Kumar, S., Macherey, W., Dyer, C., & Och, F. (2009). Efficient minimum error rate training and minimum Bayes-risk decoding for translation hypergraphs and lattices. In Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP (pp. 163–171). Association for Computational Linguistics. https://doi.org/10.3115/1687878.1687903
