On the use of BERT for neural machine translation

Citations: 55 · Mendeley readers: 176

Abstract

Exploiting large pretrained models for various NMT tasks has gained considerable visibility recently. In this work we study how pretrained BERT models can be exploited for supervised Neural Machine Translation. We compare various ways to integrate a pretrained BERT model with an NMT model and study the impact of the monolingual data used for BERT training on the final translation quality. We use the WMT-14 English-German, IWSLT15 English-German and IWSLT14 English-Russian datasets for these experiments. In addition to the standard task test sets, we evaluate on out-of-domain test sets and noise-injected test sets, in order to assess how BERT pretrained representations affect model robustness.
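To make the integration setting concrete, below is a minimal sketch of how source-side contextual representations might be extracted from a pretrained BERT model and handed to an NMT encoder. This is an illustration only, not the specific integration strategies compared in the paper; it assumes the HuggingFace transformers library and the bert-base-cased checkpoint, and the fusion with the NMT encoder itself is left abstract.

```python
# Hypothetical sketch: extract contextual source-side features from a pretrained
# BERT model so they could be fed to (or fused with) an NMT encoder.
# Assumes the HuggingFace `transformers` and `torch` packages are installed.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")  # assumed checkpoint
bert = BertModel.from_pretrained("bert-base-cased")
bert.eval()  # keep BERT frozen when used purely as a feature extractor


def bert_source_features(sentence: str) -> torch.Tensor:
    """Return (seq_len, hidden_size) contextual embeddings for one source sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = bert(**inputs)
    # Last-layer hidden states; an NMT encoder could consume or attend over these.
    return outputs.last_hidden_state.squeeze(0)


features = bert_source_features("BERT representations can initialize an NMT encoder.")
print(features.shape)  # e.g. torch.Size([num_subword_tokens, 768]) for bert-base
```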

Citation (APA)

Clinchant, S., Jung, K. W., & Nikoulina, V. (2019). On the use of BERT for neural machine translation. In EMNLP-IJCNLP 2019 - Proceedings of the 3rd Workshop on Neural Generation and Translation (pp. 108–117). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d19-5611
