Short Text Understanding Combining Text Conceptualization and Transformer Embedding

Abstract

Short text understanding is a key task in current natural language processing. Because short texts are sparse and semantically limited, traditional methods that analyze only the literal semantics of the text for understanding and similarity matching are restricted. In this paper, we propose a combined method based on knowledge-based conceptualization and a transformer encoder. Specifically, for each term in a short text, we obtain its concepts from a knowledge base and enrich the short text with co-occurring terms and concepts; we construct a convolutional neural network (CNN) to capture local context information and introduce a subnetwork based on a transformer embedding encoder. We then embed these concepts into a low-dimensional vector space so that the transformer can attend to them. Finally, the concept space and the transformer encoder space together form the understanding model. Experiments show that the proposed method captures more of the semantics of short texts and can be applied to a variety of tasks, such as short text information retrieval and short text classification.
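The conceptualization-plus-attention pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the toy knowledge base, the term and concept names, and the single-query attention pooling are all simplifying assumptions standing in for a real concept KB (such as Probase) and a full transformer encoder.

```python
import numpy as np

# Toy knowledge base mapping terms to candidate concepts.
# Hypothetical entries; a real system would query a large concept KB.
KNOWLEDGE_BASE = {
    "python": ["programming language", "snake"],
    "pandas": ["software library", "animal"],
}

def conceptualize(terms):
    """Enrich a short text with the concepts of each known term."""
    concepts = []
    for t in terms:
        concepts.extend(KNOWLEDGE_BASE.get(t, []))
    return concepts

def attention_pool(term_vecs, concept_vecs):
    """Weight concept vectors by a softmax of their similarity to the
    mean term vector -- a one-query stand-in for transformer attention."""
    query = term_vecs.mean(axis=0)
    scores = concept_vecs @ query
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ concept_vecs             # attention-weighted concept vector

rng = np.random.default_rng(0)
terms = ["python", "pandas"]
concepts = conceptualize(terms)               # four candidate concepts
term_vecs = rng.normal(size=(len(terms), 8))
concept_vecs = rng.normal(size=(len(concepts), 8))
pooled = attention_pool(term_vecs, concept_vecs)
print(pooled.shape)
```

In the paper's terms, `pooled` corresponds to the concept-space representation that is combined with the CNN/transformer text representation; here both embeddings are random placeholders.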

Citation (APA)

Li, J., Huang, G., Chen, J., & Wang, Y. (2019). Short Text Understanding Combining Text Conceptualization and Transformer Embedding. IEEE Access, 7, 122183–122191. https://doi.org/10.1109/ACCESS.2019.2938303
