Factorized transformer for multi-domain neural machine translation

Citations: 5
Readers: 61

Abstract

Multi-Domain Neural Machine Translation (NMT) aims to build a single system that performs well across a range of target domains. However, the extreme diversity of cross-domain wording and phrasing, imperfect training-data distributions, and the inherent defects of the current sequential learning process together make multi-domain NMT very challenging. To mitigate these problems, we propose the Factorized Transformer, an in-depth factorization of an NMT model's parameters (the Transformer in this paper) into two categories: domain-shared parameters that encode common cross-domain knowledge and domain-specific parameters that are private to each constituent domain. We experiment with various designs of our model and conduct extensive validation on an open English-to-French multi-domain dataset. Our approach achieves state-of-the-art performance and opens up new perspectives for multi-domain and open-domain applications.
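For orientation, the sketch below illustrates the general idea of such a parameter factorization in PyTorch: a linear layer whose weight combines a shared matrix with a per-domain matrix selected by a domain identifier. The class name `FactorizedLinear`, the additive combination, and all sizes are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch (assumed formulation, not the authors' implementation):
# factorize a layer's parameters into a domain-shared component and
# per-domain private components selected at run time.

import torch
import torch.nn as nn
import torch.nn.functional as F


class FactorizedLinear(nn.Module):
    """Linear layer whose effective weight is a shared matrix plus a
    domain-specific matrix (hypothetical additive factorization)."""

    def __init__(self, in_features: int, out_features: int, num_domains: int):
        super().__init__()
        # Domain-shared parameters: encode common cross-domain knowledge.
        self.shared_weight = nn.Parameter(torch.empty(out_features, in_features))
        # Domain-specific parameters: one private matrix per constituent domain.
        self.domain_weights = nn.Parameter(torch.zeros(num_domains, out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.xavier_uniform_(self.shared_weight)

    def forward(self, x: torch.Tensor, domain_id: int) -> torch.Tensor:
        # Combine shared and private parameters for the active domain.
        weight = self.shared_weight + self.domain_weights[domain_id]
        return F.linear(x, weight, self.bias)


if __name__ == "__main__":
    layer = FactorizedLinear(in_features=512, out_features=512, num_domains=4)
    x = torch.randn(8, 512)       # a batch of token representations
    y = layer(x, domain_id=2)     # route through domain 2's private parameters
    print(y.shape)                # torch.Size([8, 512])
```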

Cite

APA

Deng, Y., Yu, H., Yu, H., Duan, X., & Luo, W. (2020). Factorized transformer for multi-domain neural machine translation. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 4221–4230). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.377
