Word associations and the distance properties of context-aware word embeddings

7 citations · 67 Mendeley readers

Abstract

What do people know when they know the meaning of words? Word associations have been widely used to tap into lexical representations and their structure, as a way of probing semantic knowledge in humans. We investigate whether current word embedding spaces (contextualized and uncontextualized) can be considered good models of human lexical knowledge by studying whether they have characteristics comparable to human association spaces. We study three properties: association rank, asymmetry of similarity, and the triangle inequality. We find that word embeddings are good models of some word association properties. They replicate human associations between words well, and, like humans, their context-aware variants show violations of the triangle inequality. While they do show asymmetry of similarities, their asymmetries do not map onto those of human association norms.
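As an illustration of the kind of distance property discussed in the abstract, the sketch below checks whether cosine distances between word vectors respect the triangle inequality d(a, c) <= d(a, b) + d(b, c). This is not the authors' code; the three-dimensional vectors and the asteroid/belt/buckle triple are made-up placeholders (the triple is a commonly cited example of associative chaining), and in practice the vectors would come from a static or contextualized embedding model.

# Minimal sketch, assuming toy vectors; real embeddings would be loaded
# from a pretrained (contextualized or uncontextualized) model.
import numpy as np

def cosine_distance(u, v):
    # Cosine distance = 1 - cosine similarity; it is not a true metric,
    # so the triangle inequality can be violated.
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Hypothetical 3-dimensional vectors for three words.
embeddings = {
    "asteroid": np.array([0.9, 0.1, 0.0]),
    "belt":     np.array([0.5, 0.5, 0.1]),
    "buckle":   np.array([0.0, 0.8, 0.3]),
}

a, b, c = embeddings["asteroid"], embeddings["belt"], embeddings["buckle"]
d_ac = cosine_distance(a, c)
d_ab = cosine_distance(a, b)
d_bc = cosine_distance(b, c)

# A violation means the direct distance exceeds the two-step route through b.
print(f"d(asteroid, buckle) = {d_ac:.3f}")
print(f"d(asteroid, belt) + d(belt, buckle) = {d_ab + d_bc:.3f}")
print("triangle inequality violated:", d_ac > d_ab + d_bc)

With these toy vectors the check reports a violation: "asteroid" is close to "belt" and "belt" is close to "buckle", while "asteroid" and "buckle" remain distant, which is the pattern the paper examines in human association data and in context-aware embedding spaces.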

Citation (APA)

Rodriguez, M. A., & Merlo, P. (2020). Word associations and the distance properties of context-aware word embeddings. In CoNLL 2020 - 24th Conference on Computational Natural Language Learning, Proceedings of the Conference (pp. 376–385). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.conll-1.30
