Clustering-Aware Negative Sampling for Unsupervised Sentence Representation


Abstract

Contrastive learning has been widely studied in sentence representation learning. However, earlier works mainly focus on the construction of positive examples, while in-batch samples are often simply treated as negative examples. This approach overlooks the importance of selecting appropriate negative examples, potentially leading to a scarcity of hard negatives and the inclusion of false negatives. To address these issues, we propose ClusterNS (Clustering-aware Negative Sampling), a novel method that incorporates cluster information into contrastive learning for unsupervised sentence representation learning. We apply a modified K-means clustering algorithm to supply hard negatives and recognize in-batch false negatives during training, aiming to solve the two issues in one unified framework. Experiments on semantic textual similarity (STS) tasks demonstrate that our proposed ClusterNS compares favorably with baselines in unsupervised sentence representation learning. Our code has been made publicly available.
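The abstract packs both mechanisms, hard-negative supply and false-negative detection, into one sentence. The sketch below is a hypothetical PyTorch illustration of how cluster assignments could drive both inside an InfoNCE-style contrastive loss; it is not the paper's exact formulation. The function name, the same-cluster masking rule, the choice of the second-nearest centroid as the hard negative, and the temperature value are all illustrative assumptions, and the centroids are assumed to be maintained by a K-means step elsewhere in the training loop.

import torch
import torch.nn.functional as F

def clusterns_style_loss(z1, z2, centroids, temperature=0.05):
    # z1, z2: (batch, dim) embeddings of two dropout views of the same
    # sentences; centroids: (k, dim) cluster centroids from K-means.
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    centroids = F.normalize(centroids, dim=-1)

    # In-batch similarities: diagonal entries are the positives.
    sim = z1 @ z2.T / temperature                       # (batch, batch)

    # Nearest-centroid assignment for each anchor.
    c_sim = z1 @ centroids.T                            # (batch, k)
    assign = c_sim.argmax(dim=-1)                       # (batch,)

    # Off-diagonal batch samples in the anchor's own cluster are
    # treated as likely false negatives and masked out.
    same_cluster = assign.unsqueeze(1) == assign.unsqueeze(0)
    eye = torch.eye(z1.size(0), dtype=torch.bool, device=z1.device)
    sim = sim.masked_fill(same_cluster & ~eye, float("-inf"))

    # Second-nearest centroid serves as one extra hard negative
    # per anchor (an illustrative choice).
    second = c_sim.topk(2, dim=-1).indices[:, 1]        # (batch,)
    hard_sim = (z1 * centroids[second]).sum(-1, keepdim=True) / temperature

    logits = torch.cat([sim, hard_sim], dim=1)          # (batch, batch + 1)
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

Under these assumptions, the mask removes in-batch samples that share the anchor's cluster (likely paraphrases, hence false negatives) from the softmax denominator, while the appended centroid guarantees every anchor at least one negative that is close in embedding space, i.e. hard.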

Citation (APA)

Deng, J., Wan, F., Yang, T., Quan, X., & Wang, R. (2023). Clustering-Aware Negative Sampling for Unsupervised Sentence Representation. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 8713–8729). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.findings-acl.555
