MoCoUTRL: a momentum contrastive framework for unsupervised text representation learning

Citations: 0
Readers (Mendeley): 10

This article is free to access.

Abstract

This paper presents MoCoUTRL: a Momentum Contrastive Framework for Unsupervised Text Representation Learning. The model improves two aspects of recently popular contrastive learning algorithms in natural language processing (NLP). First, MoCoUTRL employs multi-granularity semantic contrastive learning objectives, enabling a more comprehensive understanding of the semantic features of samples. Second, MoCoUTRL uses a dynamic dictionary to act as an approximate ground-truth representation for each token, providing pseudo labels for token-level contrastive learning. MoCoUTRL can turn pre-trained language models (PLMs), and even large-scale language models (LLMs), into plug-and-play semantic feature extractors that can fuel multiple downstream tasks. Experimental results on several publicly available datasets, together with further theoretical analysis, validate the effectiveness and interpretability of the proposed method.
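The momentum-contrastive core the abstract refers to follows the general MoCo recipe: a slowly updated key encoder maintained as an exponential moving average of the query encoder, and an InfoNCE loss computed against a dictionary (queue) of key representations. As a rough illustration only (this is not the paper's implementation; all function names, shapes, and hyperparameter values below are assumptions), a minimal NumPy sketch:

```python
import numpy as np

def momentum_update(theta_q, theta_k, m=0.999):
    """MoCo-style EMA update of key-encoder parameters:
    theta_k <- m * theta_k + (1 - m) * theta_q."""
    return m * theta_k + (1 - m) * theta_q

def info_nce_loss(q, k, queue, temperature=0.07):
    """InfoNCE over one positive key per query plus a queue of negatives.

    q, k : (B, D) query / positive-key embeddings
    queue: (K, D) dictionary of negative keys
    Returns the mean cross-entropy with the positive at index 0.
    """
    def normalize(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    q, k, queue = normalize(q), normalize(k), normalize(queue)
    l_pos = np.sum(q * k, axis=1, keepdims=True)   # (B, 1) positive logits
    l_neg = q @ queue.T                            # (B, K) negative logits
    logits = np.concatenate([l_pos, l_neg], axis=1) / temperature
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[:, 0].mean()                 # positive is column 0
```

In the actual method, the queue would hold token-level representations produced by the momentum (key) encoder, serving as the "dynamic dictionary" of pseudo labels mentioned in the abstract; the sketch above only shows the generic loss and update machinery.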



CITATION STYLE

APA

Zou, A., Hao, W., Jin, D., Chen, G., & Sun, F. (2023). MoCoUTRL: a momentum contrastive framework for unsupervised text representation learning. Connection Science, 35(1). https://doi.org/10.1080/09540091.2023.2221406

Readers over time

[Chart: Mendeley reader counts per year, 2023–2025]

Readers' Seniority

- PhD / Post grad / Masters / Doc: 3 (50%)
- Lecturer / Post doc: 2 (33%)
- Professor / Associate Prof.: 1 (17%)

Readers' Discipline

- Computer Science: 2 (33%)
- Engineering: 2 (33%)
- Neuroscience: 2 (33%)

Article Metrics

News Mentions: 1
