FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models


Abstract

Cross-lingual transfer (CLT) has a wide range of applications. However, labeled cross-lingual corpora are expensive or even inaccessible, especially in fields where labels are private, such as diagnostic results of symptoms in medicine and user profiles in business. Despite the lack of labels, off-the-shelf models exist in these sensitive fields. Instead of pursuing the original labels, a workaround for CLT is to transfer knowledge from the off-the-shelf models without labels. To this end, we define a novel CLT problem named FreeTransfer-X that aims to transfer knowledge from off-the-shelf models in rich-resource languages. To address the problem, we propose a 2-step knowledge distillation (KD, Hinton et al., 2015) framework based on multilingual pre-trained language models (mPLM). The significant improvement over strong neural machine translation (NMT) baselines demonstrates the effectiveness of the proposed method. In addition to reducing annotation cost and protecting private labels, the proposed method is compatible with different networks and easy to deploy. Finally, a range of analyses indicates the great potential of the proposed method.
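The abstract does not detail the 2-step framework itself, but the core distillation objective it builds on (Hinton et al., 2015) can be sketched as follows. This is an illustrative minimal sketch, not the paper's implementation: the function names and the temperature value are assumptions. The key point is that the teacher's soft output distribution, not any ground-truth label, is the supervision signal.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; a higher T yields softer distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's soft labels to the student's
    predictions. No original labels are needed: the off-the-shelf
    teacher's output distribution is the only supervision signal."""
    p = softmax(teacher_logits, temperature)  # teacher soft labels
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The loss is zero when the student matches the teacher exactly,
# and positive when their predicted distributions diverge.
same = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
diff = distillation_loss([2.0, 0.5, -1.0], [-1.0, 0.5, 2.0])
```

In the paper's setting, such a loss would be applied twice: once to distill the rich-resource off-the-shelf model into an mPLM, and once to distill the mPLM into a target-language student, which is why labels never need to leave the teacher's side.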

Citation (APA)

Guo, Y., Li, L., Jiang, X., & Liu, Q. (2022). FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models. In Findings of the Association for Computational Linguistics: NAACL 2022 - Findings (pp. 217–228). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-naacl.16
