Aligning Cross-lingual Sentence Representations with Dual Momentum Contrast

Citations: 13
Readers (Mendeley): 75

Abstract

In this paper, we propose to align sentence representations from different languages into a unified embedding space, where semantic similarities (both cross-lingual and monolingual) can be computed with a simple dot product. Pre-trained language models are fine-tuned with the translation ranking task. Existing work (Feng et al., 2020) uses sentences within the same batch as negatives, an approach that can suffer from easy negatives. We adapt MoCo (He et al., 2020) to further improve the quality of alignment. Experimental results show that the sentence representations produced by our model achieve a new state of the art on several tasks, including Tatoeba en-zh similarity search (Artetxe and Schwenk, 2019b), BUCC en-zh bitext mining, and semantic textual similarity on 7 datasets.
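Since the abstract compresses the method into a few sentences, a minimal sketch of MoCo-style translation ranking may help. This is an illustrative PyTorch sketch, not the authors' released code: the class name MomentumContrast, the encoder interface (a module mapping a batch of sentences to [batch, dim] vectors), and the hyperparameters (queue size, momentum coefficient, temperature) are all assumptions. Each source sentence is scored by dot product against its translation (the positive) and against a queue of embeddings from earlier batches (the negatives), so training is not limited to easy in-batch negatives.

import torch
import torch.nn.functional as F

class MomentumContrast(torch.nn.Module):
    """MoCo-style contrastive alignment of translation pairs (illustrative)."""

    def __init__(self, encoder_q, encoder_k, dim=768, queue_size=4096,
                 momentum=0.999, temperature=0.05):
        super().__init__()
        self.encoder_q = encoder_q  # query encoder, trained by backprop
        self.encoder_k = encoder_k  # key encoder, updated only by momentum
        self.m = momentum
        self.t = temperature
        # FIFO queue of past key embeddings, reused as extra negatives
        self.register_buffer(
            "queue", F.normalize(torch.randn(queue_size, dim), dim=1))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))
        # start the key encoder as a frozen copy of the query encoder
        for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
            p_k.data.copy_(p_q.data)
            p_k.requires_grad = False

    @torch.no_grad()
    def _momentum_update(self):
        # exponential moving average of the query encoder's weights
        for p_q, p_k in zip(self.encoder_q.parameters(),
                            self.encoder_k.parameters()):
            p_k.data.mul_(self.m).add_(p_q.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        # overwrite the oldest slots (assumes queue_size % batch_size == 0)
        n, ptr = keys.shape[0], int(self.ptr)
        self.queue[ptr:ptr + n] = keys
        self.ptr[0] = (ptr + n) % self.queue.shape[0]

    def forward(self, src_batch, tgt_batch):
        # embed sources with the query encoder, translations with the key encoder
        q = F.normalize(self.encoder_q(src_batch), dim=1)
        with torch.no_grad():
            self._momentum_update()
            k = F.normalize(self.encoder_k(tgt_batch), dim=1)
        # dot-product similarities: one positive per query, many queued negatives
        l_pos = (q * k).sum(dim=1, keepdim=True)  # [B, 1]
        l_neg = q @ self.queue.t()                # [B, queue_size]
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        # the positive sits at column 0 of the logits
        labels = torch.zeros(q.shape[0], dtype=torch.long, device=q.device)
        loss = F.cross_entropy(logits, labels)
        self._enqueue(k)
        return loss

The "dual" in the title presumably refers to running two such branches, one per translation direction (e.g., en→zh and zh→en), and the queue lets the number of negatives grow far beyond the batch size; both points go beyond what this sketch shows.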

References

Bowman, S. R., Angeli, G., Potts, C., & Manning, C. D. (2015). A large annotated corpus for learning natural language inference. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP).

Conneau, A., Kiela, D., Schwenk, H., Barrault, L., & Bordes, A. (2017). Supervised learning of universal sentence representations from natural language inference data. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP).

He, K., Fan, H., Wu, Y., Xie, S., & Girshick, R. (2020). Momentum contrast for unsupervised visual representation learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).

Cited by

SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models (139 citations)

CL-ReLKT: Cross-lingual Language Knowledge Transfer for Multilingual Retrieval Question Answering (5 citations)

Supervised Cross-Momentum Contrast: Aligning representations with prototypical examples to enhance financial sentiment analysis (2 citations)


Citation (APA)

Wang, L., Zhao, W., & Liu, J. (2021). Aligning Cross-lingual Sentence Representations with Dual Momentum Contrast. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 3807–3815). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.309

[Readers over time chart, 2021–2025]

Readers' Seniority

PhD / Post grad / Masters / Doc: 19 (70%)
Researcher: 6 (22%)
Professor / Associate Prof.: 1 (4%)
Lecturer / Post doc: 1 (4%)

Readers' Discipline

Computer Science: 30 (79%)
Linguistics: 5 (13%)
Engineering: 2 (5%)
Neuroscience: 1 (3%)
