Syntax-Directed Hybrid Attention Network for Aspect-Level Sentiment Analysis

Abstract

Aspect-level sentiment analysis is a fine-grained sentiment analysis task that aims to detect the sentiment polarity towards a specific target in a sentence. Previous studies focus on a global attention mechanism that attends to all words in the context to model the interaction between the target and the sentence. However, global attention tends to assign high attention scores to irrelevant sentiment words when the sentence contains noisy words or multiple targets. To address this problem, we propose a novel syntax-directed hybrid attention network (SHAN). In SHAN, a global attention is employed to capture coarse information about the target, and a syntax-directed local attention is used to focus on words syntactically close to the target. An information gate then synthesizes the local and global attention results and adaptively generates a less noisy, more sentiment-oriented representation. Experimental results on the SemEval 2014 datasets demonstrate the effectiveness of the proposed method.
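
The abstract does not give implementation details, but the core idea can be sketched as follows. The snippet below is a minimal, illustrative sketch (not the authors' code), assuming hidden states from a sentence encoder, a pooled target vector, and precomputed dependency-tree distances from each word to the target; all module names and the syntactic window threshold are assumptions for illustration.

```python
# Minimal sketch of a hybrid (global + syntax-directed local) attention with an
# information gate, in the spirit of SHAN. Names and thresholds are illustrative.
import torch
import torch.nn as nn

class HybridAttention(nn.Module):
    def __init__(self, hidden_dim, syntax_window=2):
        super().__init__()
        self.score_global = nn.Linear(2 * hidden_dim, 1)  # scores each word against the target
        self.score_local = nn.Linear(2 * hidden_dim, 1)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)  # "information gate" mixing the two views
        self.syntax_window = syntax_window                 # max dependency-tree distance counted as "local"

    def attend(self, scorer, hidden, target, mask=None):
        # hidden: (batch, seq_len, hidden_dim); target: (batch, hidden_dim)
        expanded = target.unsqueeze(1).expand_as(hidden)
        scores = scorer(torch.cat([hidden, expanded], dim=-1)).squeeze(-1)  # (batch, seq_len)
        if mask is not None:
            scores = scores.masked_fill(~mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        return torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)           # (batch, hidden_dim)

    def forward(self, hidden, target, syntax_dist):
        # syntax_dist: (batch, seq_len) dependency-tree distance of each word to the target
        global_repr = self.attend(self.score_global, hidden, target)
        local_mask = syntax_dist <= self.syntax_window        # attend only to syntactically close words
        local_repr = self.attend(self.score_local, hidden, target, mask=local_mask)
        # Gate adaptively mixes the coarse global view and the syntax-focused local view.
        g = torch.sigmoid(self.gate(torch.cat([global_repr, local_repr], dim=-1)))
        return g * local_repr + (1.0 - g) * global_repr
```

In this reading, the gate plays the role of the information gate described in the abstract, down-weighting the global view when nearby syntactic context is more reliable; how the paper actually scores words and defines "syntactically close" may differ.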

Cite

CITATION STYLE

APA

Wang, X., Xu, G., Zhang, J., Sun, X., Wang, L., & Huang, T. (2019). Syntax-Directed Hybrid Attention Network for Aspect-Level Sentiment Analysis. IEEE Access, 7, 5014–5025. https://doi.org/10.1109/ACCESS.2018.2885032
