Random restarts in minimum error rate training for statistical machine translation


Abstract

Och's (2003) minimum error rate training (MERT) procedure is the most commonly used method for training feature weights in statistical machine translation (SMT) models. The use of multiple randomized starting points in MERT is a well-established practice, although there seems to be no published systematic study of its benefits. We compare several ways of performing random restarts with MERT. We find that all of our random restart methods outperform MERT without random restarts, and we develop some refinements of random restarts that are superior to the most common approach with regard to resulting model quality and training time. © 2008. Licensed under a Creative Commons license.
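
The core idea the abstract describes, running MERT's weight optimization from several random initial points and keeping the best result, can be sketched generically. The sketch below is illustrative rather than the authors' implementation: `optimize_weights` and `score` are hypothetical stand-ins for an Och-style line-search optimizer and a corpus-level metric such as BLEU.

```python
import random

def random_restart_mert(n_features, n_restarts, optimize_weights, score, seed=0):
    """Run a MERT-style optimizer from multiple random starting points
    and return the best weight vector found.

    `optimize_weights` and `score` are hypothetical callables standing in
    for Och-style line-search optimization and corpus-level BLEU.
    """
    rng = random.Random(seed)
    best_weights, best_score = None, float("-inf")
    for _ in range(n_restarts):
        # Draw a random starting point in feature-weight space.
        start = [rng.uniform(-1.0, 1.0) for _ in range(n_features)]
        # Run the optimizer to a local optimum from this starting point.
        weights = optimize_weights(start)
        current = score(weights)
        # Keep the restart that achieves the highest metric score.
        if current > best_score:
            best_weights, best_score = weights, current
    return best_weights, best_score
```

Because MERT's objective is non-convex, each restart can land in a different local optimum; taking the best over restarts is what the paper compares against single-start MERT and refines.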

Citation (APA)

Moore, R. C., & Quirk, C. (2008). Random restarts in minimum error rate training for statistical machine translation. In Coling 2008 - 22nd International Conference on Computational Linguistics, Proceedings of the Conference (Vol. 1, pp. 585–592). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1599081.1599155
