Earlier attention? Aspect-aware LSTM for aspect-based sentiment analysis

Abstract

Aspect-based sentiment analysis (ABSA) aims to predict fine-grained sentiments of comments with respect to given aspect terms or categories. Previous ABSA methods have recognized and verified the importance of aspect information. Most existing LSTM-based models take the aspect into account via the attention mechanism, where attention weights are calculated after the context has been modeled into contextual vectors. However, during this context modeling, classic LSTM cells may already discard aspect-related information and retain aspect-irrelevant information, leaving room to generate more effective context representations. This paper proposes a novel LSTM variant, termed aspect-aware LSTM (AA-LSTM), which incorporates aspect information into the LSTM cells at the context modeling stage, before the attention mechanism is applied. AA-LSTM can therefore dynamically produce aspect-aware contextual representations. We experiment with several representative LSTM-based models, replacing their classic LSTM cells with AA-LSTM cells. Experimental results on the SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM.
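
To make the core idea concrete, the sketch below shows one way an aspect vector could be injected into the recurrent cell itself, rather than only into a later attention layer. This is a minimal PyTorch illustration under the assumption that the aspect embedding is simply concatenated into every gate's input; the class name AspectAwareLSTMCell and this particular gate layout are hypothetical and not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class AspectAwareLSTMCell(nn.Module):
    """Illustrative sketch of an aspect-conditioned LSTM cell.

    Assumption (not the paper's exact design): a fixed aspect vector
    is concatenated with the token input and previous hidden state,
    so every gate is computed with the aspect in view.
    """

    def __init__(self, input_size: int, hidden_size: int, aspect_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # One combined projection producing the four gates (i, f, g, o),
        # conditioned on [x_t ; h_{t-1} ; aspect].
        self.gates = nn.Linear(input_size + hidden_size + aspect_size,
                               4 * hidden_size)

    def forward(self, x_t, state, aspect):
        h_prev, c_prev = state
        z = self.gates(torch.cat([x_t, h_prev, aspect], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        # Aspect-conditioned cell update: the forget/input gates can
        # suppress aspect-irrelevant context as it is read.
        c_t = f * c_prev + i * torch.tanh(g)
        h_t = o * torch.tanh(c_t)  # aspect-aware contextual vector
        return h_t, c_t

# Minimal usage: one step over a batch of 2 with hypothetical sizes.
cell = AspectAwareLSTMCell(input_size=300, hidden_size=128, aspect_size=300)
x = torch.randn(2, 300)
aspect = torch.randn(2, 300)
h0 = torch.zeros(2, 128)
c0 = torch.zeros(2, 128)
h1, c1 = cell(x, (h0, c0), aspect)
```

Because the aspect vector participates in the gates at every timestep, the cell can down-weight aspect-irrelevant context while reading the sentence, rather than filtering it only afterwards with attention.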

Citation (APA)

Xing, B., Liao, L., Song, D., Wang, J., Zhang, F., Wang, Z., & Huang, H. (2019). Earlier attention? Aspect-aware LSTM for aspect-based sentiment analysis. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 5313–5319). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/738
