Sometimes We Want Ungrammatical Translations

Citations: 9 · Mendeley readers: 46

Abstract

Rapid progress in Neural Machine Translation (NMT) systems over the last few years has focused primarily on improving translation quality, and as a secondary focus, improving robustness to perturbations (e.g., spelling errors). While performance and robustness are important objectives, by over-focusing on these, we risk overlooking other important properties. In this paper, we draw attention to the fact that for some applications, faithfulness to the original (input) text is important to preserve, even if it means introducing unusual language patterns in the (output) translation. We propose a simple, novel way to quantify whether an NMT system exhibits robustness or faithfulness, by focusing on the case of word-order perturbations. We explore a suite of functions to perturb the word order of source sentences without deleting or injecting tokens, and measure their effects on the target side. Across several experimental conditions, we observe a strong tendency towards robustness rather than faithfulness. These results allow us to better understand the trade-off between faithfulness and robustness in NMT, and open up the possibility of developing systems where users have more autonomy and control in selecting which property is best suited for their use case.
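To make the perturbation setup in the abstract concrete, the sketch below implements one plausible word-order perturbation: random adjacent-token swaps, which reorder a source sentence without deleting or injecting any tokens. This is a minimal, hypothetical Python example; the function name, parameters, and the particular swap scheme are illustrative assumptions, not the authors' actual suite of perturbation functions.

    import random

    def swap_adjacent_tokens(sentence: str, n_swaps: int = 2, seed: int = 0) -> str:
        """Perturb word order via random adjacent-token swaps.

        No tokens are deleted or injected, so the output contains
        exactly the same multiset of tokens as the input.
        (Hypothetical sketch, not the paper's implementation.)
        """
        tokens = sentence.split()
        rng = random.Random(seed)
        for _ in range(n_swaps):
            if len(tokens) < 2:
                break
            # Pick the left index of an adjacent pair and swap the pair.
            i = rng.randrange(len(tokens) - 1)
            tokens[i], tokens[i + 1] = tokens[i + 1], tokens[i]
        return " ".join(tokens)

    print(swap_adjacent_tokens("the quick brown fox jumps over the lazy dog"))

Feeding both the original and the perturbed sentence to the same NMT system and comparing the two translations gives a simple probe of which property the system favors: a robust system produces near-identical outputs, while a faithful one propagates the word-order disruption into the target.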

Citation (APA)

Parthasarathi, P., Sinha, K., Pineau, J., & Williams, A. (2021). Sometimes We Want Ungrammatical Translations. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 3205–3227). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.275
