As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning


Abstract

Omission and addition of content are typical issues in neural machine translation. We propose a method for detecting such phenomena with off-the-shelf translation models. Using contrastive conditioning, we compare the likelihood of a full sequence under a translation model to the likelihood of its parts, given the corresponding source or target sequence. This allows us to pinpoint superfluous words in the translation and untranslated words in the source, even in the absence of a reference translation. The accuracy of our method is comparable to that of a supervised method that requires a custom quality estimation model.
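The core idea of the overtranslation check can be illustrated with a minimal sketch: remove a candidate span from the translation and test whether the remaining target becomes more likely (per token) given the source. The `seq_logprob` scorer below is a hypothetical toy stand-in, not the paper's setup; in practice the score would come from an off-the-shelf NMT model's token log-probabilities.

```python
import math

def seq_logprob(src, tgt):
    # Toy stand-in scorer: rewards target words that also appear in the
    # source and penalizes ones that do not. A real implementation would
    # use an NMT model's log-probability of tgt conditioned on src.
    src_words = set(src.split())
    return sum(
        math.log(0.9) if w in src_words else math.log(0.1)
        for w in tgt.split()
    )

def superfluous_words(src, tgt, scorer=seq_logprob):
    """Flag target words whose removal increases the per-token
    likelihood of the remaining translation given the source,
    i.e. candidates for overtranslation (added content)."""
    words = tgt.split()
    full_avg = scorer(src, tgt) / max(len(words), 1)
    flagged = []
    for i, w in enumerate(words):
        partial = " ".join(words[:i] + words[i + 1:])
        partial_avg = scorer(src, partial) / max(len(words) - 1, 1)
        # Length-normalize both scores so shorter sequences are not
        # trivially favored; the epsilon guards against float noise.
        if partial_avg > full_avg + 1e-9:
            flagged.append(w)
    return flagged
```

For example, `superfluous_words("the cat sat", "the cat sat quickly")` flags `quickly` under the toy scorer, since dropping it raises the average log-probability of the remaining target. The undertranslation check works symmetrically, scoring partial source sequences under a reverse (target-to-source) model.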

APA

Vamvas, J., & Sennrich, R. (2022). As Little as Possible, as Much as Necessary: Detecting Over- and Undertranslations with Contrastive Conditioning. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 490–500). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-short.53
