CSE: Conceptual sentence embeddings based on attention model


Abstract

Most sentence embedding models represent each sentence using only surface word forms, which leaves them unable to discriminate between the ubiquitous cases of homonymy and polysemy. To enhance the representational capability of sentences, we employ a conceptualization model to assign associated concepts to each sentence in the text corpus, and then learn conceptual sentence embeddings (CSE). The resulting semantic representation is more expressive than widely used text representation models such as latent topic models, especially for short text. We further extend the CSE models with a local attention-based model that selects relevant words within the context to make predictions more efficiently. In the experiments, we evaluate the CSE models on two tasks, text classification and information retrieval. The experimental results show that the proposed models outperform typical sentence embedding models.
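To make the core idea concrete, below is a minimal NumPy sketch of how a sentence vector and an assigned concept vector might be combined with a local attention over context words to predict a target word, in the spirit of paragraph-vector-style models. The names, dimensions, and softmax attention form are illustrative assumptions, not the authors' exact formulation; the paper specifies the actual architecture and training objective.

```python
# Illustrative sketch (assumed details, not the paper's exact model):
# a sentence vector plus its concept vector form a query; local attention
# weights each context word's contribution when predicting a target word.
import numpy as np

rng = np.random.default_rng(0)
dim, vocab = 50, 1000

# Hypothetical lookup tables: word, sentence, and concept embeddings.
word_emb = rng.normal(scale=0.1, size=(vocab, dim))
sent_vec = rng.normal(scale=0.1, size=dim)       # one sentence's vector
concept_vec = rng.normal(scale=0.1, size=dim)    # its assigned concept

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_context(context_ids, query):
    """Weight context words by relevance to a query vector (local attention)."""
    ctx = word_emb[context_ids]                  # (window, dim)
    weights = softmax(ctx @ query)               # relevance of each context word
    return weights @ ctx                         # attention-weighted average

# Predict a target word from sentence + concept + attended context.
context_ids = [3, 17, 256, 891]
query = sent_vec + concept_vec
h = query + attentive_context(context_ids, query)
scores = softmax(word_emb @ h)                   # distribution over the vocabulary
print("predicted word id:", scores.argmax())
```

In a trained model, the attention concentrates probability mass on the context words most predictive of the target, so ambiguous sentences with different assigned concepts yield different embeddings.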

Citation (APA)

Wang, Y., Huang, H., Feng, C., Zhou, Q., Gu, J., & Gao, X. (2016). CSE: Conceptual sentence embeddings based on attention model. In 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers (Vol. 1, pp. 505–515). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p16-1048
