Towards Example-Based NMT with Multi-Levenshtein Transformers

Abstract

Retrieval-Augmented Machine Translation (RAMT) is attracting growing attention. This is because RAMT not only improves translation metrics, but is also assumed to implement some form of domain adaptation. In this contribution, we study another salient trait of RAMT: its ability to make translation decisions more transparent by allowing users to go back to examples that contributed to these decisions. For this, we propose a novel architecture aiming to increase this transparency. This model adapts a retrieval-augmented version of the Levenshtein Transformer and makes it amenable to simultaneously editing multiple fuzzy matches found in memory. We discuss how to perform training and inference in this model, based on multi-way alignment algorithms and imitation learning. Our experiments show that editing several examples positively impacts translation scores, notably increasing the number of target spans that are copied from existing instances.
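
The abstract compresses the method into a few sentences. As a rough illustration of the pairwise building block underlying the alignments it mentions (an imitation-learning oracle must know which tokens of each fuzzy match can be kept and which must be edited away), here is a minimal, hedged sketch: it aligns a single retrieved match to the reference with standard Levenshtein dynamic programming and labels each match token keep/delete. The paper's actual multi-way algorithm jointly aligns several matches and is more involved; all names here (`keep_mask`, etc.) are ours, not the authors' code.

```python
from typing import List

def keep_mask(match: List[str], reference: List[str]) -> List[bool]:
    """Label each token of a fuzzy match as keep (True) or delete (False),
    based on an optimal Levenshtein alignment with the reference."""
    m, n = len(match), len(reference)
    # dp[i][j] = edit distance between match[:i] and reference[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if match[i - 1] == reference[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # delete a match token
                           dp[i][j - 1] + 1,         # insert a reference token
                           dp[i - 1][j - 1] + cost)  # copy or substitute
    # Backtrace one optimal path; tokens copied along the diagonal are kept.
    mask = [False] * m
    i, j = m, n
    while i > 0 and j > 0:
        if match[i - 1] == reference[j - 1] and dp[i][j] == dp[i - 1][j - 1]:
            mask[i - 1] = True           # exact copy: keep this token
            i, j = i - 1, j - 1
        elif dp[i][j] == dp[i - 1][j - 1] + 1:
            i, j = i - 1, j - 1          # substitution: token is edited out
        elif dp[i][j] == dp[i - 1][j] + 1:
            i -= 1                       # deletion from the match
        else:
            j -= 1                       # insertion from the reference
    return mask

# Tokens kept from the match are the spans a Levenshtein-style editor
# can copy verbatim into the output.
match = "the cat sat on the mat".split()
reference = "the dog sat on the rug".split()
print(list(zip(match, keep_mask(match, reference))))
# -> the/True, cat/False, sat/True, on/True, the/True, mat/False
```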

Cite (APA)

Bouthors, M., Crego, J., & Yvon, F. (2023). Towards Example-Based NMT with Multi-Levenshtein Transformers. In EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 1830–1846). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.emnlp-main.113
