Scalable cross-lingual transfer of neural sentence embeddings


Abstract

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised natural language inference classifier InferSent and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings: it outperforms joint models in both intrinsic and extrinsic evaluations, particularly with smaller amounts of parallel data.
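To make the "sentence mapping" framework concrete, a minimal sketch follows: a linear map is fit by least squares from source-language sentence embeddings to their target-language counterparts over a parallel corpus, then applied to unseen source embeddings. All names, dimensions, and data here are toy assumptions for illustration; the paper's actual mapping method may differ (e.g., it may impose additional constraints on the transformation).

```python
import numpy as np

# Toy illustration (not the paper's exact method): learn a linear map W
# that aligns source-language sentence embeddings to the target-language
# embedding space, supervised by a small parallel corpus.
rng = np.random.default_rng(0)

d = 8             # embedding dimensionality (toy value)
n_parallel = 200  # number of parallel sentence pairs

# Synthetic ground truth: target embeddings are a noisy linear image
# of the source embeddings.
W_true = rng.normal(size=(d, d))
X_src = rng.normal(size=(n_parallel, d))                      # source side
Y_tgt = X_src @ W_true + 0.01 * rng.normal(size=(n_parallel, d))  # target side

# Least-squares fit of the mapping on the parallel data.
W, *_ = np.linalg.lstsq(X_src, Y_tgt, rcond=None)

# Map a held-out source embedding into the target space and compare
# against the ground-truth image.
x_new = rng.normal(size=(1, d))
error = np.linalg.norm(x_new @ W - x_new @ W_true)
```

Because the mapping is trained after the monolingual models are fixed, this style of alignment is modular: new languages can be added without retraining the underlying encoders.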

Citation (APA)

Aldarmaki, H., & Diab, M. (2019). Scalable cross-lingual transfer of neural sentence embeddings. In *SEM@NAACL-HLT 2019 - 8th Joint Conference on Lexical and Computational Semantics (pp. 51–60). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/s19-1006
