The aim of aspect-based sentiment analysis (ABSA) is to determine the sentiment polarity of a specific aspect in a sentence. Most recent works exploit syntactic information by applying Graph Attention Networks (GATs) over dependency trees, and have achieved great progress. However, models based on the traditional GAT do not fully exploit syntactic information such as the diversified types of dependency relations. A variant of GAT, the relational graph attention network (R-GAT), takes different types of dependency relations into consideration, but ignores the information hidden in word-pairs. In this paper, we propose a novel model called the weighted relational graph attention network (WRGAT). It exploits more accurate syntactic information by employing a weighted relational head, in which contextual information from word-pairs is introduced into the computation of the attention weights of dependency relations. Furthermore, we employ BERT instead of a Bi-directional Long Short-term Memory (Bi-LSTM) network to generate the contextual and aspect representations fed to the WRGAT, and adopt an index selection method to keep the word-level dependencies consistent with the word-piece units of BERT. With the proposed BERT-WRGAT architecture, we achieve state-of-the-art performance on four ABSA datasets.
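The two mechanisms named above can be illustrated with a minimal sketch. The first function shows one plausible form of a weighted relational head, where the attention score for each neighbor combines a relation-type term with a word-pair contextual term; the second shows first-piece index selection to align word-level dependency edges with BERT word-piece outputs. All function names, shapes, and the specific score formulation are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def weighted_relational_attention(h_aspect, H_ctx, rel_emb, W_rel, W_pair):
    """Hypothetical weighted relational head (sketch, not the paper's code).

    For each dependency neighbor i, the attention score mixes:
      - a relation-type term:        rel_emb[i] @ W_rel @ h_aspect
      - a word-pair contextual term: H_ctx[i]   @ W_pair @ h_aspect
    so that word-pair context modulates the relation attention weights.
    """
    n = H_ctx.shape[0]
    scores = np.empty(n)
    for i in range(n):
        rel_score = rel_emb[i] @ W_rel @ h_aspect    # relation-type contribution
        pair_score = H_ctx[i] @ W_pair @ h_aspect    # word-pair contribution
        scores[i] = rel_score + pair_score
    alpha = softmax(scores)          # attention weights over neighbors
    return alpha @ H_ctx             # weighted aggregation of context vectors

def select_word_reps(subword_reps, word_to_first_piece):
    """Index selection (assumed first-piece variant): pick one word-piece
    vector per word so word-level dependency edges line up with BERT output."""
    return subword_reps[np.asarray(word_to_first_piece)]
```

In this sketch, `h_aspect` is an aspect representation, `H_ctx` holds the contextual representations of the aspect's dependency neighbors, and `rel_emb` holds one embedding per dependency relation type on those edges; the sum-of-terms score is one simple choice among several the paper might use.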
Citation
Huo, Y., Jiang, D., & Sahli, H. (2021). Aspect-based Sentiment Analysis with Weighted Relational Graph Attention Network. In ICMI 2021 Companion - Companion Publication of the 2021 International Conference on Multimodal Interaction (pp. 63–70). Association for Computing Machinery, Inc. https://doi.org/10.1145/3461615.3491104