BHGAttN: A Feature-Enhanced Hierarchical Graph Attention Network for Sentiment Analysis

Abstract

Recently, with the rise of deep learning, text classification techniques have developed rapidly. However, existing work usually models the entire text as a single object, paying little attention to the hierarchical structure within the text and ignoring the connections between neighboring sentences. To address these issues, this paper proposes BHGAttN, a BERT-based hierarchical graph attention network that combines a large-scale pretrained model with graph attention networks to model the hierarchical relationships within texts. During modeling, the semantic features are enhanced by the outputs of BERT's intermediate layers, and a multilevel hierarchical graph network corresponding to each BERT layer is constructed from the dependencies between the whole sentence and its subsentences. The model thus attends to layer-by-layer semantic information as well as the hierarchical relationships within the text. Experimental results show that BHGAttN exhibits significant competitive advantages over current state-of-the-art baseline models.
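
To make the core idea concrete, the sketch below shows a single-head graph attention layer operating on a whole-sentence node and its sub-sentence nodes, where the node features would be pooled from one intermediate BERT layer (e.g., via a pretrained encoder with hidden states exposed). This is a minimal illustration under those assumptions, not the authors' implementation; the class name, the star-shaped adjacency, and the pooling setup are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head, GAT-style attention over sentence/sub-sentence nodes."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # node feature projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (N, in_dim) node features (e.g., pooled BERT-layer vectors)
        # adj: (N, N) 0/1 adjacency with self-loops
        Wh = self.W(h)                                     # (N, out_dim)
        N = Wh.size(0)
        # Attention logits e_ij = LeakyReLU(a([Wh_i || Wh_j]))
        Wh_i = Wh.unsqueeze(1).expand(N, N, -1)
        Wh_j = Wh.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([Wh_i, Wh_j], dim=-1)).squeeze(-1))
        # Keep only graph edges, normalize, then aggregate neighbors.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ Wh)


if __name__ == "__main__":
    # Toy example: one whole-sentence node plus three sub-sentence nodes with
    # 768-d features; edges link the whole sentence to each sub-sentence
    # (a star graph with self-loops), mimicking one level of the hierarchy.
    feats = torch.randn(4, 768)
    adj = torch.eye(4)
    adj[0, 1:] = 1.0
    adj[1:, 0] = 1.0
    gat = GraphAttentionLayer(768, 128)
    print(gat(feats, adj).shape)  # torch.Size([4, 128])
```

In the paper's setting, one such graph would be built per BERT layer, so the attention-weighted node representations capture both layer-wise semantics and the sentence/sub-sentence hierarchy before classification.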

Cite

APA

Zhang, J., Cui, Z., Park, H. J., & Noh, G. (2022). BHGAttN: A Feature-Enhanced Hierarchical Graph Attention Network for Sentiment Analysis. Entropy, 24(11). https://doi.org/10.3390/e24111691
