Large Scale Substitution-based Word Sense Induction

Citations: 10 · Mendeley readers: 47

Abstract

We present a word-sense induction method based on pre-trained masked language models (MLMs), which can cheaply scale to large vocabularies and large corpora. The result is a corpus that is sense-tagged according to a corpus-derived sense inventory, where each sense is associated with indicative words. Evaluation on English Wikipedia sense-tagged with our method shows that both the induced senses and the per-instance sense assignments are of high quality, even compared to WSD methods such as Babelfy. Furthermore, by training a static word-embedding algorithm on the sense-tagged corpus, we obtain high-quality static senseful embeddings, which outperform existing senseful embedding methods on the WiC dataset and on a new outlier-detection dataset we developed. The data-driven nature of the algorithm makes it possible to induce corpus-specific senses that may not appear in standard sense inventories, as we demonstrate in a case study on the scientific domain.
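The pipeline the abstract describes (MLM substitutes for each occurrence of a word, clustered into corpus-derived senses, then per-instance sense tagging) can be illustrated with a short sketch. This is a minimal, hypothetical illustration rather than the authors' implementation: the model choice (bert-base-uncased), the top-20 substitute cutoff, masking the target word, and the use of networkx's Louvain community detection (the "fast unfolding" algorithm of Blondel et al.) are all assumptions made for the example.

```python
# Minimal, hypothetical sketch of substitution-based WSI; not the paper's code.
from itertools import combinations

import networkx as nx
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

def substitutes(sentence: str, target: str) -> set[str]:
    """Top-k MLM substitutes for the first occurrence of `target`."""
    masked = sentence.replace(target, fill.tokenizer.mask_token, 1)
    return {pred["token_str"].strip() for pred in fill(masked, top_k=20)}

def induce_senses(occurrences):
    """occurrences: (sentence, target) pairs for one ambiguous word.
    Returns induced senses (sets of indicative words) and a tag per occurrence."""
    per_occ = [substitutes(sent, tgt) for sent, tgt in occurrences]
    graph = nx.Graph()
    for subs in per_occ:
        # Substitutes proposed for the same occurrence tend to share a sense,
        # so connect them; edge weights count co-occurrences across occurrences.
        for a, b in combinations(sorted(subs), 2):
            prev = graph.get_edge_data(a, b, {"weight": 0})["weight"]
            graph.add_edge(a, b, weight=prev + 1)
    # Louvain ("fast unfolding") community detection; requires networkx >= 2.8.
    senses = nx.community.louvain_communities(graph, weight="weight")
    # Tag each occurrence with the community overlapping its substitutes most.
    tags = [max(range(len(senses)), key=lambda i: len(senses[i] & subs))
            for subs in per_occ]
    return senses, tags

senses, tags = induce_senses([
    ("He sat on the bank of the river.", "bank"),
    ("She deposited the check at the bank.", "bank"),
])
```

Given per-occurrence sense tags, the senseful static embeddings mentioned in the abstract can then be obtained by rewriting each tagged occurrence as a sense-marked token (e.g., bank_2, a hypothetical naming scheme) and training a standard static embedding model such as gensim's Word2Vec on the rewritten corpus.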



Citation (APA)

Eyal, M., Sadde, S., Taub-Tabib, H., & Goldberg, Y. (2022). Large Scale Substitution-based Word Sense Induction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 4738–4752). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.325

Readers over time

[Chart: Mendeley readers per year, 2021–2025; y-axis 0–20]

Readers' Seniority

PhD / Post grad / Masters / Doc: 7 (50%)
Researcher: 4 (29%)
Professor / Associate Prof.: 2 (14%)
Lecturer / Post doc: 1 (7%)

Readers' Discipline

Computer Science: 14 (74%)
Linguistics: 3 (16%)
Neuroscience: 1 (5%)
Agricultural and Biological Sciences: 1 (5%)
