A Deep Recursive Neural Network Model for Fine-Grained Opinion Classification

Abstract

In recent times, deep neural networks (DNN) have gained prominence in solving many deep learning tasks. In particular, recursive neural networks (RNN) have been used effectively to explore semantic composition over natural language content represented in structured formats (e.g. parse trees). Although RNN are deep in structure, they fail to exhibit the hierarchical representations observed in traditional deep feed-forward networks (DFNN) and in deep recurrent neural networks (DRcNN). The notion of depth can, however, be incorporated by stacking multiple recursive layers, which results in deep recursive neural networks (DRNN). In addition, enhanced word spaces offer further benefits in capturing fine-grained semantic regularities. In this paper, we address the problem of fine-grained opinion classification using a DRNN and word embeddings. The efficiency of the DRNN model is evaluated through a series of experiments over several opinion datasets. The results show that the proposed DRNN architecture achieves a better prediction rate for fine-grained classification than conventional shallow counterparts that employ similar parameters.
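To make the stacking idea concrete, the following is a minimal PyTorch sketch (not the authors' implementation) of a deep recursive network over binary parse trees: each recursive layer re-traverses the same tree, combining the children's states at that layer with the same node's state from the layer below, and a classifier reads the top layer's root vector. The class names, parameter choices, and the specific way the layers are connected are illustrative assumptions about one plausible reading of "stacking multiple recursive layers".

```python
# Illustrative sketch of a deep recursive neural network (DRNN) over binary
# parse trees; all names and design details are hypothetical, not the paper's code.
import torch
import torch.nn as nn

class Tree:
    """Binary parse-tree node; leaves carry a vocabulary index."""
    def __init__(self, word_idx=None, left=None, right=None):
        self.word_idx, self.left, self.right = word_idx, left, right

    @property
    def is_leaf(self):
        return self.left is None and self.right is None

class RecursiveLayer(nn.Module):
    """One recursive layer: composes the children's states at this depth with
    the same node's state from the layer below."""
    def __init__(self, dim):
        super().__init__()
        self.compose = nn.Linear(3 * dim, dim)  # [left, right, lower] -> hidden
        self.leaf = nn.Linear(dim, dim)         # projects lower-layer leaf vectors

    def node(self, left_h, right_h, lower_h):
        return torch.tanh(self.compose(torch.cat([left_h, right_h, lower_h], dim=-1)))

class DeepRecursiveNN(nn.Module):
    """Stacked recursive layers over a shared parse tree, with a classifier
    on the root node of the top layer."""
    def __init__(self, vocab_size, dim, num_layers, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)  # pre-trained word embeddings could be loaded here
        self.layers = nn.ModuleList([RecursiveLayer(dim) for _ in range(num_layers)])
        self.classifier = nn.Linear(dim, num_classes)

    def _traverse(self, tree, layer, lower):
        """One bottom-up pass of a single layer; returns {node id: hidden state}.
        `lower` holds the states of the layer below (None for the first layer)."""
        states = {}

        def visit(node):
            if node.is_leaf:
                base = self.embed(torch.tensor(node.word_idx)) if lower is None else lower[id(node)]
                states[id(node)] = torch.tanh(layer.leaf(base))
            else:
                visit(node.left)
                visit(node.right)
                below = torch.zeros(self.embed.embedding_dim) if lower is None else lower[id(node)]
                states[id(node)] = layer.node(states[id(node.left)], states[id(node.right)], below)

        visit(tree)
        return states

    def forward(self, tree):
        states = self._traverse(tree, self.layers[0], lower=None)  # first recursive layer
        for layer in self.layers[1:]:                              # stack further recursive layers
            states = self._traverse(tree, layer, lower=states)
        return self.classifier(states[id(tree)])                   # fine-grained logits at the root

# Toy usage: a three-word sentence as a small binary tree, five sentiment classes.
tree = Tree(left=Tree(word_idx=0),
            right=Tree(left=Tree(word_idx=1), right=Tree(word_idx=2)))
model = DeepRecursiveNN(vocab_size=10, dim=16, num_layers=3, num_classes=5)
print(model(tree).shape)  # torch.Size([5])
```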

Citation (APA)

Wadawadagi, R. S., & Pagi, V. B. (2019). A Deep Recursive Neural Network Model for Fine-Grained Opinion Classification. In Communications in Computer and Information Science (Vol. 1037, pp. 607–621). Springer Verlag. https://doi.org/10.1007/978-981-13-9187-3_54
