A transfer learning based hierarchical attention neural network for sentiment classification

Abstract

The purpose of document-level sentiment classification in social networks is to predict the sentiment a user expresses in a document. Traditional methods based on deep neural networks rely on unsupervised word vectors. However, such word vectors cannot accurately capture the contextual relationships among words. Moreover, the Recurrent Neural Networks (RNNs) commonly used for sentiment classification have complex structures and numerous parameters, and are therefore hard to train. To address these issues, we propose a Transfer Learning based Hierarchical Attention Neural Network (TLHANN). First, we train an encoder on a machine translation task so that it learns to represent words in context. Second, we transfer the encoder to the sentiment classification task by concatenating the hidden vectors it generates with the corresponding unsupervised word vectors. Finally, for sentiment classification we apply a two-level hierarchical network, with a simplified RNN unit called the Minimal Gated Unit (MGU) and an attention mechanism at each level. Experimental results on several datasets show that TLHANN achieves excellent performance.
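The abstract's two building blocks, the MGU recurrent cell and attention-based pooling over hidden states, can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: the MGU equations follow the standard single-forget-gate formulation, and the `attention` function is a generic dot-product-scored softmax pooling; all names and the initialization scheme are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MGUCell:
    """Minimal Gated Unit: a simplified RNN cell with a single forget gate
    (illustrative sketch; weight init and shapes are assumptions)."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Gate and candidate weights act on the concatenation [h_{t-1}; x_t].
        self.W_f = rng.uniform(-s, s, (hidden_size, hidden_size + input_size))
        self.b_f = np.zeros(hidden_size)
        self.W_h = rng.uniform(-s, s, (hidden_size, hidden_size + input_size))
        self.b_h = np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        f = sigmoid(self.W_f @ np.concatenate([h_prev, x_t]) + self.b_f)
        # Candidate state, computed from the gated previous state and the input.
        h_tilde = np.tanh(self.W_h @ np.concatenate([f * h_prev, x_t]) + self.b_h)
        # New hidden state: convex combination controlled by the forget gate.
        return (1.0 - f) * h_prev + f * h_tilde

def attention(hidden_states, u):
    """Pool a sequence of hidden states: score each against a context
    vector u, softmax the scores, and return the weighted sum."""
    scores = np.array([h @ u for h in hidden_states])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return sum(w * h for w, h in zip(weights, hidden_states))
```

In the hierarchical setup the paper describes, one such MGU-plus-attention layer would summarize the words of a sentence into a sentence vector, and a second layer would summarize the sentence vectors into a document vector for classification.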

Citation (APA)

Qu, Z., Wang, Y., Wang, X., & Zheng, S. (2018). A transfer learning based hierarchical attention neural network for sentiment classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10943 LNCS, pp. 383–392). Springer Verlag. https://doi.org/10.1007/978-3-319-93803-5_36
