Mimicking Infants' Bilingual Language Acquisition for Domain Specialized Neural Machine Translation

Abstract

Existing methods of training domain-specialized neural machine translation (DS-NMT) models are based on the pretrain-finetuning approach (PFA). In this study, we reinterpret existing methods from the perspective of cognitive science research on cross-language speech perception. We propose the cross communication method (CCM), a new DS-NMT training approach. Inspired by how infants acquire two languages, we perform DS-NMT training by configuring batches that draw from the domain corpus (DC) and the general corpus (GC) concurrently. Quantitative and qualitative analyses of our experimental results show that CCM achieves superior performance compared to conventional methods. Additionally, we conducted an experiment addressing DS-NMT service deployment to meet industrial demands.
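The batch configuration the abstract describes lends itself to a short sketch. Below is a minimal, hypothetical illustration of training on DC and GC concurrently within each batch, written in PyTorch. The PairDataset wrapper, the 1:1 DC/GC mixing ratio, and the model and optimizer choices are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: build each training batch from both the domain
# corpus (DC) and the general corpus (GC) so the model trains on them
# concurrently, in the spirit of CCM. The 1:1 mixing ratio, model
# interface, and optimizer below are assumptions for illustration.
import torch
from torch.utils.data import DataLoader, Dataset


class PairDataset(Dataset):
    """Tiny wrapper over (source, target) token-id tensor pairs."""

    def __init__(self, pairs):
        self.pairs = pairs

    def __len__(self):
        return len(self.pairs)

    def __getitem__(self, idx):
        return self.pairs[idx]


def train_ccm_style(model, dc_pairs, gc_pairs, epochs=1, batch_size=16):
    # Half of every batch comes from DC, half from GC.
    dc_loader = DataLoader(PairDataset(dc_pairs),
                           batch_size=batch_size // 2, shuffle=True)
    gc_loader = DataLoader(PairDataset(gc_pairs),
                           batch_size=batch_size // 2, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    for _ in range(epochs):
        for (dc_src, dc_tgt), (gc_src, gc_tgt) in zip(dc_loader, gc_loader):
            # Concatenate the DC and GC halves into one mixed batch.
            src = torch.cat([dc_src, gc_src], dim=0)
            tgt = torch.cat([dc_tgt, gc_tgt], dim=0)
            loss = model(src, tgt)  # assumed to return a scalar loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```

Under this sketch, the model never sees a purely in-domain gradient step, which is the contrast with PFA's sequential pretrain-then-finetune schedule.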

Citation (APA)

Park, C., Go, W. Y., Eo, S., Moon, H., Lee, S., & Lim, H. (2022). Mimicking Infants’ Bilingual Language Acquisition for Domain Specialized Neural Machine Translation. IEEE Access, 10, 38684–38693. https://doi.org/10.1109/ACCESS.2022.3165572
