A Hierarchical Model with Recurrent Convolutional Neural Networks for Sequential Sentence Classification


Abstract

Hierarchical neural network approaches have achieved outstanding results in recent work on sequential sentence classification. However, it remains challenging for such models to capture both the locally invariant features and the word-dependent information of a sentence. In this work, we focus on the sentence representation and context modeling components that determine the effectiveness of the hierarchical architecture. We present a new approach, SR-RCNN, that generates more precise sentence encodings by leveraging the complementary strengths of a bi-directional recurrent neural network and a text convolutional neural network to capture contextual and literal relevance information. The sentence-level encoding vectors are then modeled to capture the intrinsic relations among surrounding sentences. In addition, we explore the applicability of attention mechanisms and conditional random fields to the task. Our model advances sequential sentence classification on medical abstracts to new state-of-the-art performance.
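The following is a minimal sketch (not the authors' code) of the hierarchical idea described in the abstract: each sentence is encoded by combining a bi-directional RNN (word-dependent, contextual information) with a text CNN (local invariant n-gram features), and the resulting sentence vectors are passed through a sentence-level BiLSTM that models relations among surrounding sentences. Layer sizes, filter widths, the pooling scheme, and the plain linear classifier (the paper additionally explores attention and a CRF) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SentenceEncoder(nn.Module):
    """Encodes a batch of sentences: pooled BiLSTM output + CNN features, concatenated."""

    def __init__(self, vocab_size, emb_dim=100, hidden=128, n_filters=100,
                 kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, n_filters, k, padding=k // 2) for k in kernel_sizes]
        )
        self.out_dim = 2 * hidden + n_filters * len(kernel_sizes)

    def forward(self, tokens):                        # tokens: (n_sents, seq_len)
        x = self.emb(tokens)                          # (n_sents, seq_len, emb_dim)
        rnn_out, _ = self.rnn(x)                      # (n_sents, seq_len, 2*hidden)
        rnn_feat = rnn_out.max(dim=1).values          # max-pool over time
        conv_in = x.transpose(1, 2)                   # (n_sents, emb_dim, seq_len)
        conv_feats = [torch.relu(c(conv_in)).max(dim=2).values for c in self.convs]
        return torch.cat([rnn_feat] + conv_feats, dim=1)


class HierarchicalClassifier(nn.Module):
    """Sentence-level BiLSTM over sentence vectors, then per-sentence label logits."""

    def __init__(self, vocab_size, n_labels=5, ctx_hidden=128):
        super().__init__()
        self.sent_enc = SentenceEncoder(vocab_size)
        self.ctx_rnn = nn.LSTM(self.sent_enc.out_dim, ctx_hidden,
                               batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * ctx_hidden, n_labels)

    def forward(self, abstract_tokens):                # (n_sents, seq_len) for one abstract
        sent_vecs = self.sent_enc(abstract_tokens)     # (n_sents, sent_dim)
        ctx, _ = self.ctx_rnn(sent_vecs.unsqueeze(0))  # (1, n_sents, 2*ctx_hidden)
        return self.classifier(ctx.squeeze(0))         # (n_sents, n_labels)


# Usage on toy data: one "abstract" of 4 sentences, 12 token ids each.
model = HierarchicalClassifier(vocab_size=5000)
logits = model(torch.randint(1, 5000, (4, 12)))
print(logits.shape)  # torch.Size([4, 5])
```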

Citation (APA)

Jiang, X., Zhang, B., Ye, Y., & Liu, Z. (2019). A Hierarchical Model with Recurrent Convolutional Neural Networks for Sequential Sentence Classification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11839 LNAI, pp. 78–89). Springer. https://doi.org/10.1007/978-3-030-32236-6_7
