Unsupervised sentence-embeddings by manifold approximation and projection

4 Citations (citations of this article)
71 Readers (Mendeley users who have this article in their library)

Abstract

The concept of unsupervised universal sentence encoders has gained traction recently, wherein pre-trained models generate effective task-agnostic, fixed-dimensional representations for phrases, sentences and paragraphs. Such methods are of varying complexity, from simple weighted averages of word vectors to complex language models based on bidirectional transformers. In this work we propose a novel technique to generate sentence-embeddings in an unsupervised fashion by projecting the sentences onto a fixed-dimensional manifold with the objective of preserving local neighbourhoods in the original space. To delineate such neighbourhoods we experiment with several set-distance metrics, including the recently proposed Word Mover's distance, while the fixed-dimensional projection is achieved by employing a scalable and efficient manifold-approximation method rooted in topological data analysis. We test our approach, which we term EMAP, or Embeddings by Manifold Approximation and Projection, on six publicly available text-classification datasets of varying size and complexity. Empirical results show that our method consistently performs comparably to or better than several alternative state-of-the-art approaches.
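
As a rough illustration of the recipe described in the abstract (not the authors' implementation): pairwise set distances between sentences can be computed with Word Mover's distance over pre-trained word vectors, and the resulting distance matrix can be projected to a fixed dimension with a neighbourhood-preserving manifold-approximation method. The sketch below assumes gensim's GloVe vectors and the umap-learn library; the toy corpus, output dimensionality and hyperparameters are illustrative choices, not values from the paper.

    import numpy as np
    import umap                         # pip install umap-learn
    import gensim.downloader as api     # pip install gensim (wmdistance also needs POT)

    # Toy corpus; the paper itself evaluates on six public text-classification datasets.
    sentences = [
        "the cat sat on the mat",
        "a dog slept on the rug",
        "stocks fell sharply after the report",
        "markets dropped on weak earnings",
        "the chef cooked a wonderful meal",
        "dinner at the restaurant was delicious",
    ]
    tokens = [s.lower().split() for s in sentences]

    # Pre-trained word vectors; each sentence is treated as a bag of word vectors.
    wv = api.load("glove-wiki-gigaword-50")

    # Step 1: pairwise set distances between sentences (here, Word Mover's distance).
    n = len(tokens)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = wv.wmdistance(tokens[i], tokens[j])

    # Step 2: project onto a fixed-dimensional manifold that preserves local
    # neighbourhoods of the precomputed distance matrix.
    reducer = umap.UMAP(n_components=2, metric="precomputed",
                        n_neighbors=3, random_state=42)
    sentence_embeddings = reducer.fit_transform(dist)

    print(sentence_embeddings.shape)  # (6, 2): one fixed-dimensional vector per sentence

In practice the output dimensionality, neighbourhood size and choice of set distance would be tuned per dataset, and the resulting vectors can be fed to any downstream classifier.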



CITATION STYLE

APA

Kayal, S. (2021). Unsupervised sentence-embeddings by manifold approximation and projection. In EACL 2021 - 16th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference (pp. 1–11). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.eacl-main.1

Readers over time ('21–'25): chart not reproduced.

Readers' Seniority

PhD / Post grad / Masters / Doc: 21 (75%)
Researcher: 4 (14%)
Lecturer / Post doc: 2 (7%)
Professor / Associate Prof.: 1 (4%)

Readers' Discipline

Computer Science: 25 (76%)
Linguistics: 5 (15%)
Business, Management and Accounting: 2 (6%)
Neuroscience: 1 (3%)
