Neural relation classification using selective attention and symmetrical directional instances


Abstract

Relation classification (RC) is an important task in information extraction from unstructured text. Recently, several neural methods based on various network architectures have been adopted for RC. Among them, convolutional neural network (CNN)-based models stand out due to their simple structure, low model complexity, and competitive performance. Nevertheless, existing CNN-based RC models still suffer from at least two limitations. First, when handling samples with long distances between entities, they fail to extract effective features and may even pick up distracting features from intervening clauses, which decreases accuracy. Second, existing RC models tend to produce inconsistent results when fed the forward and backward instances of an identical sample. Therefore, we present a novel CNN-based sentence encoder with selective attention that leverages shortest dependency paths, and devise a classification framework that fuses information from symmetrical directional instances (forward and backward). Comprehensive experiments verify the superior performance of the proposed RC model over mainstream competitors without additional artificial features.
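The second idea in the abstract, fusing the predictions of the forward and backward instances of one sample so both directions yield a single consistent label, can be sketched as follows. This is a minimal illustration, not the paper's method: the averaging fusion rule, the function names, and the 3-class logits are all assumptions for the example.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuse_directional_scores(fwd_logits, bwd_logits):
    """Average the class distributions of the forward and backward
    instances of the same sample, so both directions produce one
    consistent prediction. (Averaging is an assumed fusion operator
    for illustration; the paper's exact fusion may differ.)"""
    probs = 0.5 * (softmax(fwd_logits) + softmax(bwd_logits))
    return probs, int(np.argmax(probs))

# Hypothetical 3-class example: on their own the two directional
# instances disagree, but fusion yields a single shared prediction.
fwd = np.array([2.0, 1.0, 0.1])   # forward instance favors class 0
bwd = np.array([0.5, 2.5, 0.1])   # backward instance favors class 1
probs, label = fuse_directional_scores(fwd, bwd)
```

Because the fused distribution is symmetric in its two arguments, swapping the forward and backward instances leaves the prediction unchanged, which is exactly the directional consistency the abstract targets.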

Citation (APA)

Tan, Z., Li, B., Huang, P., Ge, B., & Xiao, W. (2018). Neural relation classification using selective attention and symmetrical directional instances. Symmetry, 10(9). https://doi.org/10.3390/sym10090357
