Word embeddings-based sentence-level sentiment analysis considering word importance

25 citations · 49 Mendeley readers

Abstract

Word2vec has been shown to facilitate various Natural Language Processing (NLP) tasks. We hypothesize that the word2vec vector space can be separated into positive and negative regions, and hence that word2vec can be applied to sentiment analysis. In our previous research, we proposed a word-embeddings (WEMB) based sentence-level sentiment analysis method: word vectors from WEMB are combined to compute a sentence vector, a classification model is trained on the sentence vectors and their polarity labels, and the trained model then predicts the polarity of unlabeled sentences. However, the sentence vector was insufficient because the method gave every word the same weight when computing it. In this paper, we propose a method that addresses this problem by weighting each word according to its importance when computing the sentence vector. Compared with the method without word importance, the proposed method improves accuracy, although a considerable gap with the state of the art remains. We discuss the next improvements and present future work.
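The core idea described in the abstract, computing a sentence vector as an importance-weighted combination of word vectors, can be sketched as below. This is a minimal illustration, not the paper's implementation: the embeddings and importance scores here are made-up values, and the paper's exact weighting scheme may differ (IDF-style scores are assumed for illustration).

```python
import numpy as np

# Hypothetical word embeddings; in the paper these come from a trained
# word2vec model. Values here are illustrative only.
embeddings = {
    "great": np.array([0.9, 0.1, 0.3]),
    "movie": np.array([0.2, 0.5, 0.4]),
    "plot":  np.array([0.3, 0.4, 0.6]),
}

# Hypothetical importance weights (e.g. IDF-style scores); the paper's
# actual importance measure may be different.
importance = {"great": 2.0, "movie": 0.5, "plot": 1.0}

def sentence_vector(tokens, embeddings, importance):
    """Importance-weighted average of the word vectors in a sentence."""
    vecs, weights = [], []
    for tok in tokens:
        if tok in embeddings:  # skip out-of-vocabulary words
            vecs.append(embeddings[tok])
            weights.append(importance.get(tok, 1.0))
    if not vecs:
        # No known words: return a zero vector of the embedding dimension.
        return np.zeros(next(iter(embeddings.values())).shape)
    return np.average(np.array(vecs), axis=0, weights=weights)

vec = sentence_vector(["great", "movie", "plot"], embeddings, importance)
```

The resulting sentence vector would then serve as the feature input to a polarity classifier; the unweighted baseline corresponds to setting every importance score to 1.0.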

Citation (APA):
Hayashi, T., & Fujita, H. (2019). Word embeddings-based sentence-level sentiment analysis considering word importance. Acta Polytechnica Hungarica, 16(7), 7–24. https://doi.org/10.12700/APH.16.7.2019.7.1
