Target-based attention model for aspect-level sentiment analysis

Abstract

Aspect-level sentiment classification, which aims to determine the sentiment polarity of a specific target word or phrase in a sentence, is a crucial task in natural language processing (NLP). Previous work has proposed various attention methods to capture the parts of the context that matter for the given target. However, these methods allow little interaction between aspects and contexts and cannot accurately quantify the importance of context words using aspect information. To address these issues, we propose a novel target-based attention model (TBAM) for aspect-level sentiment analysis, which applies an attention mechanism between the position-aware context representation matrix and the target (aspect) representation. TBAM jointly generates more accurate word-level attention scores between aspects and contexts and produces more discriminative features for classification. Experimental results show that our model achieves state-of-the-art performance on three public datasets compared to other architectures.
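To make the idea concrete, the sketch below shows one plausible way to form a joint word-level score matrix between a position-weighted context and a target, as described in the abstract. It is a minimal PyTorch illustration, not the authors' exact TBAM architecture: the class name, layer choices, max-over-aspect pooling, and dimensions are all assumptions.

# Illustrative sketch of word-level attention between a position-weighted
# context and an aspect (target). Names and pooling choices are assumptions,
# not the published TBAM design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetBasedAttentionSketch(nn.Module):
    def __init__(self, embed_dim: int, hidden_dim: int, num_classes: int = 3):
        super().__init__()
        self.proj_ctx = nn.Linear(embed_dim, hidden_dim)   # project context words
        self.proj_asp = nn.Linear(embed_dim, hidden_dim)   # project aspect words
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, context, aspect, position_weight):
        # context:         (batch, n_ctx, embed_dim)  context word embeddings
        # aspect:          (batch, n_asp, embed_dim)  aspect/target word embeddings
        # position_weight: (batch, n_ctx)             per-word proximity weight in [0, 1]

        # Position-aware context: down-weight words far from the target.
        ctx = context * position_weight.unsqueeze(-1)

        h_ctx = torch.tanh(self.proj_ctx(ctx))      # (batch, n_ctx, hidden)
        h_asp = torch.tanh(self.proj_asp(aspect))   # (batch, n_asp, hidden)

        # Joint word-level score matrix: every context word vs. every aspect word.
        scores = torch.bmm(h_ctx, h_asp.transpose(1, 2))     # (batch, n_ctx, n_asp)

        # Collapse over aspect words, then normalize over context positions.
        attn = F.softmax(scores.max(dim=-1).values, dim=-1)   # (batch, n_ctx)

        # Attention-weighted context representation feeds the classifier.
        rep = torch.bmm(attn.unsqueeze(1), h_ctx).squeeze(1)  # (batch, hidden)
        return self.classifier(rep)

# Example usage on random tensors (3-way sentiment: negative/neutral/positive).
model = TargetBasedAttentionSketch(embed_dim=300, hidden_dim=128)
ctx = torch.randn(2, 20, 300)
asp = torch.randn(2, 3, 300)
pos = torch.rand(2, 20)
logits = model(ctx, asp, pos)   # shape (2, 3)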

Citation (APA)

Chen, W., Yu, W., Zhang, Z., Zhang, Y., Xu, K., Zhang, F., … Yang, Z. (2019). Target-based attention model for aspect-level sentiment analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11955 LNCS, pp. 259–269). Springer. https://doi.org/10.1007/978-3-030-36718-3_22
