Selective Attention Based Graph Convolutional Networks for Aspect-Level Sentiment Classification

31 citations · 115 readers on Mendeley
Abstract

Recent work on aspect-level sentiment classification has employed Graph Convolutional Networks (GCN) over dependency trees to learn interactions between aspect terms and opinion words. In some cases, the opinion words corresponding to an aspect term cannot be reached within two hops on the dependency tree, so modeling the interaction would require additional GCN layers. However, GCNs often achieve their best performance with two layers, and deeper GCNs bring no additional gain. Therefore, we design a novel selective attention based GCN (SA-GCN) model. On one hand, the proposed model enables direct interaction between aspect terms and context words via a self-attention operation, without the distance limitation imposed by the dependency tree. On the other hand, a top-k selection procedure is designed to locate opinion words by selecting the k context words with the highest attention scores. We conduct experiments on several commonly used benchmark datasets, and the results show that our proposed SA-GCN outperforms strong baseline models.
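
The mechanism the abstract describes (aspect-to-context attention followed by a top-k selection of the most-attended context words) can be sketched roughly as follows. This is a minimal illustration under assumed tensor shapes and names, not the authors' implementation; `aspect_repr`, `context_repr`, and `k` are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def select_top_k_context(aspect_repr: torch.Tensor,
                         context_repr: torch.Tensor,
                         k: int = 3):
    """Score every context word against the aspect representation with
    dot-product attention, then keep only the k highest-scoring words.

    aspect_repr:  (hidden,)         pooled aspect-term representation
    context_repr: (seq_len, hidden) contextual word representations,
                                    e.g. the output of a GCN layer
    """
    # Attention scores between the aspect and every context word.
    scores = context_repr @ aspect_repr                 # (seq_len,)
    attn = F.softmax(scores, dim=-1)                    # (seq_len,)

    # Top-k selection: indices of the k most-attended context words,
    # intended to correspond to likely opinion words.
    k = min(k, context_repr.size(0))
    top_weights, top_idx = attn.topk(k)                 # (k,), (k,)

    # Aggregate only the selected words, re-normalizing their weights.
    selected = context_repr[top_idx]                     # (k, hidden)
    weights = top_weights / top_weights.sum()
    aspect_aware = (weights.unsqueeze(-1) * selected).sum(dim=0)  # (hidden,)
    return aspect_aware, top_idx
```

With `context_repr` taken from a GCN layer over the dependency tree, the returned indices point at the context words that receive the most attention for the given aspect, i.e. the candidates the model treats as opinion words, regardless of their distance from the aspect term in the tree.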

Citation (APA)

Hou, X., Huang, J., Wang, G., Qi, P., He, X., & Zhou, B. (2021). Selective Attention Based Graph Convolutional Networks for Aspect-Level Sentiment Classification. In Proceedings of the Fifteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-15), NAACL 2021 (pp. 83–93). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.textgraphs-1.8
