Contrastive Loss is All You Need to Recover Analogies as Parallel Lines

Abstract

While static word embedding models are known to represent linguistic analogies as parallel lines in high-dimensional space, the underlying mechanism that gives rise to these geometric structures remains obscure. We find that an elementary contrastive-style optimization over distributional information performs competitively with popular word embedding models on analogy-recovery tasks while achieving dramatic speedups in training time. Further, we demonstrate that a contrastive loss is sufficient to create these parallel structures in word embeddings, and we establish a precise relationship between the co-occurrence statistics and the geometric structure of the resulting word embeddings.
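
For intuition only, here is a minimal sketch of what an elementary contrastive-style objective over word-context co-occurrence pairs can look like. This is an assumption on our part: an in-batch InfoNCE-style loss with hypothetical names (contrastive_loss, word_emb, ctx_emb), not the authors' implementation or the paper's exact loss.

    import torch
    import torch.nn.functional as F

    def contrastive_loss(word_vecs, ctx_vecs, temperature=0.1):
        # In-batch InfoNCE: the i-th context is the positive for the
        # i-th word; every other context in the batch is a negative.
        logits = F.normalize(word_vecs, dim=-1) @ F.normalize(ctx_vecs, dim=-1).T
        logits = logits / temperature
        targets = torch.arange(word_vecs.size(0))
        return F.cross_entropy(logits, targets)

    # Toy usage on randomly sampled (word, context) pairs; in practice
    # the pairs would come from corpus co-occurrence statistics.
    vocab_size, dim, batch = 10_000, 128, 256
    word_emb = torch.nn.Embedding(vocab_size, dim)  # embeddings being trained
    ctx_emb = torch.nn.Embedding(vocab_size, dim)   # context-side embeddings
    words = torch.randint(vocab_size, (batch,))
    ctxs = torch.randint(vocab_size, (batch,))
    loss = contrastive_loss(word_emb(words), ctx_emb(ctxs))
    loss.backward()  # gradients flow into both embedding tables

If training on real co-occurrence pairs succeeds, an analogy a : b :: c : d should surface geometrically: the difference vectors v_b − v_a and v_d − v_c become approximately parallel, which is the structure the paper analyzes.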

Citation (APA)

Ri, N., Lee, F. T., & Verma, N. (2023). Contrastive Loss is All You Need to Recover Analogies as Parallel Lines. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 164–173). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.repl4nlp-1.14
