Using multi-sense vector embeddings for reverse dictionaries

Citations of this article: 10
Mendeley readers (users with this article in their library): 83

Abstract

Popular word embedding methods such as word2vec and GloVe assign a single vector representation to each word, even if a word has multiple distinct meanings. Multi-sense embeddings instead provide a separate vector for each sense of a word. However, they typically cannot serve as a drop-in replacement for conventional single-sense embeddings, because the correct sense vector must be selected for each word. In this work, we study the effect of multi-sense embeddings on the reverse dictionary task. We propose a technique to integrate them easily into an existing neural network architecture using an attention mechanism. Our experiments demonstrate that large improvements can be obtained when employing multi-sense embeddings both in the input sequence and for the target representation. We also provide an analysis of the sense distributions and of the learned attention.
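The attention-based integration described in the abstract can be pictured as a soft selection over a word's sense vectors: each sense vector is scored against a context (query) vector, and the scores are normalized into a convex combination. The sketch below only illustrates that general idea; it is not the authors' architecture (the paper builds on an LSTM-based reverse dictionary model), and the function name, vector shapes, and dot-product scoring are assumptions made for this example.

```python
import numpy as np

def attend_over_senses(sense_vectors, query, temperature=1.0):
    """Collapse a word's sense vectors into one vector via soft attention.

    sense_vectors: (num_senses, dim) array, one row per sense of the word
    query:         (dim,) context vector used to score each sense
    Returns the attention-weighted sense vector and the attention weights.
    """
    scores = sense_vectors @ query / temperature      # (num_senses,)
    scores -= scores.max()                            # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()   # softmax over senses
    return weights @ sense_vectors, weights

# Toy example: a word with three 4-dimensional sense vectors
rng = np.random.default_rng(0)
senses = rng.normal(size=(3, 4))
context = rng.normal(size=4)
pooled, alpha = attend_over_senses(senses, context)
print("attention weights:", alpha.round(3))
print("pooled sense vector:", pooled.round(3))
```

In the paper's setting, such a weighted combination would let the downstream network consume one vector per token while still learning which sense matters in the given context; a hard argmax over the scores would instead pick a single sense.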

References (via Scopus)

Long Short-Term Memory (77,222 citations)
GloVe: Global Vectors for Word Representation (26,958 citations)
WordNet: A Lexical Database for English (11,688 citations)

Cited by (via Scopus)

Towards Non-Ambiguous Reverse Dictionary (3 citations)
Polysemy needs attention: Short-text topic discovery with global and multi-sense information (2 citations)


Citation (APA)

Hedderich, M. A., Yates, A., Klakow, D., & de Melo, G. (2019). Using multi-sense vector embeddings for reverse dictionaries. In IWCS 2019 - Proceedings of the 13th International Conference on Computational Semantics - Long Papers (pp. 247–258). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w19-0421

Readers over time

[Chart: Mendeley readers per year, 2019–2025; values not reproduced]

Readers' Seniority

PhD / Post grad / Masters / Doc: 21 (64%)
Researcher: 9 (27%)
Lecturer / Post doc: 2 (6%)
Professor / Associate Prof.: 1 (3%)

Readers' Discipline

Computer Science: 33 (80%)
Linguistics: 5 (12%)
Engineering: 2 (5%)
Neuroscience: 1 (2%)
