Sampling and Filtering of Neural Machine Translation Distillation Data

Citations: 0
Mendeley readers: 60

Abstract

In most neural machine translation distillation or stealing scenarios, the goal is to preserve the performance of the target model (teacher). The highest-scoring hypothesis of the teacher model is commonly used to train a new model (student). If reference translations are also available, then better hypotheses (with respect to the references) can be upsampled and poor hypotheses either removed or undersampled. This paper explores the importance sampling method landscape (pruning, hypothesis upsampling and undersampling, deduplication, and their combinations) with English-to-Czech and English-to-German MT models, using standard MT evaluation metrics. We show that careful upsampling and combination with the original data leads to better performance than training only on the original or synthesized data, or on their direct combination.
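The filtering and upsampling idea can be sketched in a few lines. This is a minimal illustration, not the paper's actual pipeline: the function names, thresholds, and the unigram-F1 stand-in metric (the paper uses standard MT metrics such as BLEU and chrF) are all assumptions for this example.

```python
from collections import Counter

def unigram_f1(hyp: str, ref: str) -> float:
    """Unigram F1 overlap with the reference: a crude stand-in
    for a real MT metric such as chrF or sentence-level BLEU."""
    hyp_counts, ref_counts = Counter(hyp.split()), Counter(ref.split())
    overlap = sum((hyp_counts & ref_counts).values())  # multiset intersection
    if overlap == 0:
        return 0.0
    precision = overlap / sum(hyp_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

def build_student_data(examples, drop_below=0.2, upsample_above=0.6, factor=2):
    """Score each teacher hypothesis against the reference, prune poor
    hypotheses, and repeat good ones (hypothesis upsampling)."""
    out = []
    for src, hyp, ref in examples:
        score = unigram_f1(hyp, ref)
        if score < drop_below:
            continue  # prune: hypothesis too far from the reference
        copies = factor if score >= upsample_above else 1
        out.extend([(src, hyp)] * copies)
    return out

# (source, teacher hypothesis, reference) triples
examples = [
    ("s1", "the cat sat", "the cat sat"),            # perfect -> upsampled
    ("s2", "completely unrelated words", "the cat sat"),  # zero overlap -> pruned
    ("s3", "a cat sat there", "the cat sat"),        # partial -> kept once
]
student_data = build_student_data(examples)
```

The resulting `student_data` would then be concatenated with the original parallel data before training the student, which is the combination the paper finds most effective.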


Citation (APA)

Zouhar, V. (2021). Sampling and Filtering of Neural Machine Translation Distillation Data. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Student Research Workshop (pp. 1–8). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-srw.1


Readers' Seniority

PhD / Postgrad / Masters / Doc: 15 (65%)
Researcher: 4 (17%)
Lecturer / Postdoc: 3 (13%)
Professor / Associate Prof.: 1 (4%)

Readers' Discipline

Computer Science: 20 (71%)
Linguistics: 5 (18%)
Social Sciences: 2 (7%)
Neuroscience: 1 (4%)
