Joint semantic and distributional word representations with multi-graph embeddings


Abstract

Word embeddings remain of great use to NLP researchers and practitioners due to their training speed and their ease of use and distribution. Prior work has shown that these word representations can be improved by the use of semantic knowledge bases. In this paper we propose a novel way of combining such knowledge bases while retaining the lexical information from word co-occurrences. The approach is conceptually clear: it maps both distributional and semantic information into a multi-graph and modifies existing node embedding techniques to compute word representations. Our experiments show improved results compared to vanilla word embeddings, retrofitting, and concatenation techniques using the same information, on a variety of word-similarity datasets.
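The pipeline the abstract describes (merge co-occurrence edges and knowledge-base edges into one multi-graph, then apply a node embedding technique) can be sketched roughly as follows. This is an illustrative sketch, not the authors' actual implementation: the toy corpus, the synonym lexicon, the window size, and the random-walk parameters are all assumptions, and the walks stand in for the input of any skip-gram-style trainer.

```python
import random
from collections import defaultdict

def build_multigraph(sentences, synonym_pairs, window=2):
    """Build a multi-graph as an adjacency list of (neighbor, relation)
    pairs: distributional edges from co-occurrence within a window,
    plus semantic edges from a knowledge base (here, a synonym list).
    Parallel edges with different relation labels are kept."""
    edges = defaultdict(list)
    for sent in sentences:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if i != j:
                    edges[w].append((sent[j], "cooc"))
    for a, b in synonym_pairs:
        edges[a].append((b, "syn"))
        edges[b].append((a, "syn"))
    return edges

def random_walks(edges, num_walks=10, walk_len=5, seed=0):
    """Generate random-walk 'sentences' over the multi-graph; feeding
    these to a skip-gram trainer (e.g. word2vec) yields embeddings that
    mix distributional and semantic neighborhoods."""
    rng = random.Random(seed)
    walks = []
    for node in list(edges):
        for _ in range(num_walks):
            walk = [node]
            for _ in range(walk_len - 1):
                nbrs = edges[walk[-1]]
                if not nbrs:
                    break
                walk.append(rng.choice(nbrs)[0])
            walks.append(walk)
    return walks
```

Because semantic and distributional relations become edges in the same graph, a single walk can hop from a co-occurrence neighbor to a knowledge-base synonym, which is what lets one embedding jointly reflect both signals.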

Citation (APA)

Moreux, P. D., & Galle, M. (2019). Joint semantic and distributional word representations with multi-graph embeddings. In EMNLP-IJCNLP 2019 - Graph-Based Methods for Natural Language Processing - Proceedings of the 13th Workshop (pp. 118–123). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/d19-5314
