Enhancing Attention-Based LSTM with Position Context for Aspect-Level Sentiment Classification


Abstract

Aspect-level sentiment classification is an interesting but challenging research problem, namely, predicting the sentiment polarity toward a specific aspect term in an opinionated sentence. Attention-based recurrent neural networks have previously been proposed to address this problem, because the attention mechanism can identify the words that contribute more to the prediction than others, and they have shown great promise. However, the major drawback of these attention-based approaches is that explicit position context is ignored. Drawing inspiration from the way position context is modeled in information retrieval and question answering, we hypothesize that we should pay much more attention to the context words neighboring the aspect than to those far away, especially when a review sentence is long or contains multiple aspect terms. Based on this conjecture, in this paper, we put forward a new attentive LSTM model, dubbed PosATT-LSTM, which not only takes into account the importance of each context word but also incorporates position-aware vectors, which represent the explicit position context between the aspect and its context words. We conduct extensive experiments on the SemEval 2014 datasets, and the encouraging results indicate the efficacy of our proposed approach.
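The core idea described above can be illustrated with a small sketch. The weighting scheme below (a linear decay with distance from the aspect term, combined multiplicatively with attention scores before normalization) is one common way to encode position context and is an assumption for illustration, not the paper's exact formulation; the function names are hypothetical.

```python
import math

def position_weights(n, aspect_idx):
    # Illustrative scheme: context words closer to the aspect term
    # receive larger weights, decaying linearly with distance.
    return [1.0 - abs(i - aspect_idx) / n for i in range(n)]

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def position_aware_attention(scores, aspect_idx):
    # Scale raw attention scores by position weights, then normalize,
    # so that words near the aspect dominate the attention distribution.
    pw = position_weights(len(scores), aspect_idx)
    return [a for a in softmax([s * w for s, w in zip(scores, pw)])]

# Toy example: 5 context words with equal raw scores, aspect at index 2.
# The resulting distribution peaks at the aspect position.
alphas = position_aware_attention([1.0, 1.0, 1.0, 1.0, 1.0], 2)
```

With equal raw scores, the position weights alone determine the distribution, so attention concentrates around the aspect term, which is the intuition the paper builds on for long sentences with multiple aspects.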

Cite

APA

Zeng, J., Ma, X., & Zhou, K. (2019). Enhancing Attention-Based LSTM with Position Context for Aspect-Level Sentiment Classification. IEEE Access, 7, 20462–20471. https://doi.org/10.1109/ACCESS.2019.2893806
