Although contextual information has the potential to improve lexical selection, most state-of-the-art machine translation systems take only minimal context into account. We capture context with a topic model over distributional profiles built from the context words of each translation unit. Topic distributions are inferred for each translation unit and used to adapt the translation model dynamically to a given test context by measuring their similarity. We show that combining information from both local and global test contexts helps to improve lexical selection and outperforms a baseline system by up to 1.15 BLEU. We test our topic-adapted model on a diverse data set containing documents from three different domains and achieve competitive performance in comparison with two supervised domain-adapted systems.
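To make the adaptation step concrete, the minimal Python sketch below (not the authors' implementation) ranks translation candidates for an ambiguous source word by the similarity between each candidate's topic distribution and the topic distribution inferred for the test context. The topic vectors, the example translation units, and the use of cosine similarity are illustrative assumptions; in a phrase-based decoder such a score would typically enter as an additional translation-model feature.

```python
# Hedged sketch: rank translation options by topic similarity to the test context.
# Topic vectors and candidates are made-up examples, not data from the paper.
from math import sqrt

def cosine(p, q):
    """Cosine similarity between two topic distributions given as lists of floats."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = sqrt(sum(a * a for a in p)) * sqrt(sum(b * b for b in q))
    return dot / norm if norm > 0.0 else 0.0

# Hypothetical topic distributions, e.g. inferred by a topic model over the
# distributional profiles (context words) of each translation unit.
candidate_topics = {
    "bank ||| Bank (financial)": [0.70, 0.10, 0.20],
    "bank ||| Ufer (river)":     [0.05, 0.85, 0.10],
}

# Topic distribution inferred for the local/global test context.
test_context_topics = [0.10, 0.75, 0.15]

# Rank candidates by similarity to the test context; the most topically
# coherent translation comes first.
ranked = sorted(
    candidate_topics.items(),
    key=lambda kv: cosine(kv[1], test_context_topics),
    reverse=True,
)
for unit, topics in ranked:
    print(f"{cosine(topics, test_context_topics):.3f}  {unit}")
```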
Hasler, E., Haddow, B., & Koehn, P. (2014). Dynamic topic adaptation for SMT using distributional profiles. In Proceedings of the Ninth Workshop on Statistical Machine Translation (pp. 445–456). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/w14-3358