Specializing Word Vectors by Spectral Decomposition on Heterogeneously Twisted Graphs

Abstract

Traditional word vectors, such as word2vec and GloVe, have a well-known inclination to conflate semantic similarity with other semantic relations. A retrofitting procedure may be needed to address this issue. In this work, we propose a new retrofitting method called Heterogeneously Retrofitted Spectral Word Embedding. It heterogeneously twists the similarity matrix of word pairs with lexical constraints. A new set of word vectors is generated by a spectral decomposition of the similarity matrix, which has a closed-form linear-algebraic solution. Our method achieves competitive performance compared with state-of-the-art retrofitting methods such as AR (Mrkšić et al., 2017). In addition, since our embedding has a clear linear-algebraic relationship with the similarity matrix, we carefully study the contribution of each component in our model. Last but not least, our method is very efficient to execute.
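A minimal sketch of the general idea described above (a hypothetical toy example, not the authors' exact formulation): twist a cosine-similarity matrix toward +1 for synonym pairs and -1 for antonym pairs (the lexical constraints), then recover specialized vectors from the top eigenpairs of the twisted matrix. The vocabulary, constraint pairs, and rank `k` here are all illustrative assumptions.

```python
import numpy as np

# Toy pretrained vectors for a 4-word vocabulary (hypothetical data).
rng = np.random.default_rng(0)
vocab = ["good", "great", "bad", "terrible"]
X = rng.standard_normal((4, 8))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Cosine-similarity matrix of all word pairs.
S = X @ X.T

# "Twist" the matrix with lexical constraints (assumed pairs):
# push synonyms toward similarity +1 and antonyms toward -1.
synonyms = [(0, 1), (2, 3)]
antonyms = [(0, 2), (1, 3)]
for i, j in synonyms:
    S[i, j] = S[j, i] = 1.0
for i, j in antonyms:
    S[i, j] = S[j, i] = -1.0

# Spectral decomposition: take the top-k eigenpairs of the (symmetric)
# twisted matrix, so that W @ W.T approximates it in a low-rank sense.
k = 3
vals, vecs = np.linalg.eigh(S)          # eigenvalues in ascending order
top = np.argsort(vals)[::-1][:k]
W = vecs[:, top] * np.sqrt(np.clip(vals[top], 0.0, None))

print(W.shape)                          # one specialized k-dim vector per word
```

The closed form is what makes this family of methods cheap to execute: the specialized embedding is read directly off an eigendecomposition rather than fit by iterative optimization.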

Citation (APA)

Ren, Y., & Du, Y. (2020). Specializing Word Vectors by Spectral Decomposition on Heterogeneously Twisted Graphs. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 3599–3609). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.321
