Topic enhanced word vectors for documents representation

Abstract

Word representation, as the basic element of document representation, plays a crucial role in natural language processing. Topic models and word embedding models have both made great progress on word representation. Several studies combine the two models, and most of them assume that the semantics of the context depends on the semantics of the current word and its topic. This paper proposes a topic enhanced word vectors model (TEWV), which strengthens the representational capability of word vectors by integrating topic information with the semantics of the context. Unlike previous work, TEWV assumes that the semantics of the current word depends on the semantics of the context and the topic, which is more consistent with the intuitive direction of the dependency relationship. Experimental results on the 20NewsGroup dataset show that the approach outperforms state-of-the-art methods.
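The abstract only states the dependency assumption, not the model's internals. Under that assumption, a minimal CBOW-style sketch — scoring the current word from the mean of its context word vectors plus a topic vector — might look like the following; all parameter names, dimensions, and the additive combination are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

V, K, D = 10, 3, 8  # vocab size, number of topics, embedding dim (illustrative)

# Hypothetical parameters: input word vectors, topic vectors, output weights.
word_vecs = rng.normal(scale=0.1, size=(V, D))
topic_vecs = rng.normal(scale=0.1, size=(K, D))
out_weights = rng.normal(scale=0.1, size=(V, D))

def predict_current_word(context_ids, topic_id):
    """Score each vocabulary word as the 'current word' given its context
    words and a topic: hidden state = mean of context vectors + topic vector,
    followed by a softmax over the vocabulary."""
    h = word_vecs[context_ids].mean(axis=0) + topic_vecs[topic_id]
    scores = out_weights @ h
    e = np.exp(scores - scores.max())  # numerically stable softmax
    return e / e.sum()

probs = predict_current_word([1, 2, 4], topic_id=0)
```

Training such a model would maximize the probability of each observed word given its context and topic assignment, so the learned word vectors absorb both kinds of information.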

Cite

CITATION STYLE

APA

Li, D., Li, Y., & Wang, S. (2017). Topic enhanced word vectors for documents representation. In Communications in Computer and Information Science (Vol. 774, pp. 166–177). Springer Verlag. https://doi.org/10.1007/978-981-10-6805-8_14
