Two-stage attention network for aspect-level sentiment classification

Citations: 1
Readers (Mendeley): 3
Abstract

Currently, most attention-based works adopt a single-stage attention process when generating context representations toward an aspect, but they lack a deliberation process: the generated aspect-related representation is used directly as the final output without further polishing. In this work, we introduce a deliberation process that models the context to further polish the attention weights, and propose a two-stage attention network for aspect-level sentiment classification. The network uses a two-stage attention model on top of an LSTM, where the first-stage attention generates a raw aspect-related representation and the second-stage attention polishes and refines that raw representation through deliberation. Since the deliberation component has global information about what the representation to be generated might be, it has the potential to produce a better aspect-related representation by looking a second time into the hidden states produced by the LSTM. Experimental results on the Laptop dataset of SemEval-2016 Task 5 indicate that our model achieves a state-of-the-art accuracy of 76.56%.
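The two-stage mechanism the abstract describes can be sketched in miniature. This is an illustrative reconstruction, not the paper's implementation: dot-product scoring stands in for whatever attention scoring function the authors use, the LSTM is elided (we take its hidden states as given), and all names are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(hidden, query):
    """One attention pass: weight each hidden state by its dot-product
    similarity to the query vector, then return the weighted sum."""
    weights = softmax([dot(h, query) for h in hidden])
    dim = len(hidden[0])
    return [sum(w * h[i] for w, h in zip(weights, hidden)) for i in range(dim)]

def two_stage_attention(hidden, aspect):
    # Stage 1: attend over the LSTM hidden states with the aspect vector
    # as query, producing a raw aspect-related representation.
    raw = attend(hidden, aspect)
    # Stage 2 (deliberation): attend over the SAME hidden states again,
    # now conditioned on the raw representation, to polish it.
    return attend(hidden, raw)
```

The key design point is that stage 2 re-reads the full sequence of hidden states with knowledge of what stage 1 produced, rather than refining the raw vector in isolation.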

Cite

CITATION STYLE

APA

Gao, K., Xu, H., Gao, C., Sun, X., Deng, J., & Zhang, X. (2018). Two-stage attention network for aspect-level sentiment classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11304 LNCS, pp. 316–325). Springer Verlag. https://doi.org/10.1007/978-3-030-04212-7_27