Recurrent neural network for text classification with hierarchical multiscale dense connections

Abstract

Text classification is a fundamental task in many Natural Language Processing applications. While recurrent neural networks have achieved great success in text classification, they fail to capture the hierarchical structure and long-term semantic dependencies that are common features of text data. Inspired by the dense connection pattern in advanced convolutional neural networks, we propose a simple yet effective recurrent architecture, named Hierarchical Multiscale Densely Connected RNNs (HM-DenseRNNs), which: 1) enables direct access to the hidden states of all preceding recurrent units via dense connections, and 2) organizes multiple densely connected recurrent units into a hierarchical multiscale structure, where the layers are updated at different time scales. HM-DenseRNNs can effectively capture long-term dependencies among words in long text data, and a dense recurrent block is further introduced to reduce the number of parameters and enhance training efficiency. We evaluate the performance of the proposed architecture on three text datasets, and the results verify the advantages of HM-DenseRNNs over the baseline methods in terms of classification accuracy.
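The two ideas in the abstract — dense connections that feed every layer the hidden states of all preceding layers, and a multiscale schedule under which higher layers update less often — can be illustrated with a toy sketch. This is not the authors' implementation: the plain tanh RNN cell, the layer sizes, the update scales, and the random toy input are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_cell(x, h, Wx, Wh, b):
    # Plain tanh RNN update: h' = tanh(Wx x + Wh h + b)
    return np.tanh(Wx @ x + Wh @ h + b)

def hm_dense_rnn(inputs, num_layers=3, hidden=8, scales=(1, 2, 4)):
    """Toy hierarchical-multiscale densely connected RNN (illustrative only).

    inputs: (T, d) sequence of token embeddings.
    scales[l]: layer l updates only every scales[l] time steps (multiscale).
    """
    T, d = inputs.shape
    # Dense connections: layer l sees the token embedding plus the hidden
    # states of all preceding layers, so its input width grows with depth.
    params = []
    for l in range(num_layers):
        in_dim = d + l * hidden
        params.append((rng.standard_normal((hidden, in_dim)) * 0.1,
                       rng.standard_normal((hidden, hidden)) * 0.1,
                       np.zeros(hidden)))
    hs = [np.zeros(hidden) for _ in range(num_layers)]
    for t in range(T):
        lower_states = []  # hidden states of preceding layers at this step
        for l in range(num_layers):
            if t % scales[l] == 0:  # higher layers update at coarser scales
                x = np.concatenate([inputs[t]] + lower_states)
                Wx, Wh, b = params[l]
                hs[l] = rnn_cell(x, hs[l], Wx, Wh, b)
            lower_states.append(hs[l])
    # Sequence representation: concatenation of all layers' final states,
    # which a linear classifier could consume for text classification.
    return np.concatenate(hs)

feats = hm_dense_rnn(rng.standard_normal((8, 4)))
print(feats.shape)  # (24,) = 3 layers x 8 hidden units
```

Because layer l skips updates on most steps, its state changes at a coarser time scale and can carry longer-range context, while the dense inputs give it direct access to every lower layer's representation.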

Citation (APA)

Zhao, Y., Shen, Y., & Yao, J. (2019). Recurrent neural network for text classification with hierarchical multiscale dense connections. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 5450–5456). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/757
