Improving Contrastive Learning of Sentence Embeddings with Focal-InfoNCE


Abstract

The recent success of SimCSE has greatly advanced state-of-the-art sentence representations. However, the original formulation of SimCSE does not fully exploit the potential of hard negative samples in contrastive learning. This study introduces an unsupervised contrastive learning framework that combines SimCSE with hard negative mining, aiming to enhance the quality of sentence embeddings. The proposed focal-InfoNCE function introduces self-paced modulation terms into the contrastive objective, downweighting the loss associated with easy negatives and encouraging the model to focus on hard negatives. Experiments on various STS benchmarks show that our method improves sentence embeddings in terms of Spearman's correlation and representation alignment and uniformity. Our code is available at: https://github.com/puerrrr/Focal-InfoNCE.
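To make the idea of focal modulation concrete, the following is a minimal illustrative sketch of an InfoNCE loss whose negative terms are reweighted by a focal-style factor. This is not the paper's exact formulation; the sigmoid-based weight, the `gamma` exponent, and the temperature value are assumptions chosen to show the mechanism of downweighting easy (low-similarity) negatives.

```python
import math

def focal_infonce(sim, tau=0.05, gamma=2.0):
    """Hypothetical sketch of a focal-modulated InfoNCE loss.

    sim: square list-of-lists of cosine similarities, where sim[i][i]
    is the positive pair for anchor i and off-diagonal entries are
    in-batch negatives. A focal-style weight that grows with a
    negative's similarity downweights easy (dissimilar) negatives so
    that hard negatives dominate the objective.
    """
    n = len(sim)
    total = 0.0
    for i in range(n):
        pos = math.exp(sim[i][i] / tau)
        neg = 0.0
        for j in range(n):
            if j == i:
                continue
            # Assumed modulation: sigmoid(sim)^gamma is close to 1 for
            # hard (high-similarity) negatives and smaller for easy ones.
            w = (1.0 / (1.0 + math.exp(-sim[i][j]))) ** gamma
            neg += w * math.exp(sim[i][j] / tau)
        total += -math.log(pos / (pos + neg))
    return total / n
```

Under this sketch, a batch whose negatives sit close to the positives in similarity contributes a larger loss than one with well-separated negatives, which is the behavior the focal modulation is meant to encourage.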

Citation (APA)

Hou, P., & Li, X. (2023). Improving Contrastive Learning of Sentence Embeddings with Focal-InfoNCE. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 4757–4762). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-emnlp.315
