Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation

6 citations · 20 Mendeley readers

Abstract

Context-aware translation can be achieved by processing a concatenation of consecutive sentences with the standard Transformer architecture. This paper investigates the intuitive idea of providing the model with explicit information about the position of the sentences contained in the concatenation window. We compare various methods, including novel ones, for encoding sentence positions into token representations. Our results show that the Transformer benefits from certain sentence position encoding methods on En→Ru, if trained with a context-discounted loss (Lupo et al., 2022b). However, the same benefits are not observed on En→De. Further empirical efforts are necessary to define the conditions under which the proposed approach is beneficial.
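The two ingredients named in the abstract can be sketched concretely. The snippet below is a minimal, hypothetical illustration (not the paper's actual implementation, which compares several encoding variants): a per-sentence position embedding added to the token embeddings of a concatenation window, and a context-discounted loss that down-weights the training loss on context-sentence tokens relative to the current sentence. All names and the discount value are illustrative assumptions.

```python
import numpy as np

def add_sentence_positions(token_embs, sent_ids, sent_emb_table):
    """Add a sentence-position embedding to every token's embedding.

    token_embs:     (seq_len, d_model) token embeddings of the concatenated window
    sent_ids:       (seq_len,) index of the sentence each token belongs to
                    (0 = oldest context sentence, last index = current sentence)
    sent_emb_table: (max_sents, d_model) one learned vector per sentence position
    """
    # Fancy indexing broadcasts one table row onto each token of that sentence.
    return token_embs + sent_emb_table[sent_ids]

def context_discounted_loss(token_losses, sent_ids, current_idx, discount=0.5):
    """Weighted mean of per-token losses: context tokens get weight `discount`,
    tokens of the current sentence get weight 1 (discount value is illustrative)."""
    weights = np.where(sent_ids == current_idx, 1.0, discount)
    return float((weights * token_losses).sum() / weights.sum())

# Toy window: 3 sentences of 2 tokens each, model dimension 4.
rng = np.random.default_rng(0)
sent_emb_table = rng.normal(size=(3, 4))
token_embs = np.zeros((6, 4))                  # zeros so the effect is visible
sent_ids = np.array([0, 0, 1, 1, 2, 2])        # sentence 2 is the current one

out = add_sentence_positions(token_embs, sent_ids, sent_emb_table)

token_losses = np.array([1.0, 1.0, 1.0, 1.0, 2.0, 2.0])
loss = context_discounted_loss(token_losses, sent_ids, current_idx=2)
```

Each token in the same sentence receives the same added vector, so the model can tell sentences apart even though standard positional encodings run continuously across the whole window.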


Citation (APA)

Lupo, L., Dinarelli, M., & Besacier, L. (2023). Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation. In ACL 2023 - 4th Workshop on Insights from Negative Results in NLP, Proceedings (pp. 33–44). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.insights-1.4


