Multi-Gram CNN-Based Self-Attention Model for Relation Classification

Abstract

Relation classification is a crucial ingredient in numerous information-extraction systems and has attracted a great deal of attention in recent years. Traditional approaches rely largely on feature engineering and suffer from poor domain adaptation and error propagation. To overcome these problems, many deep neural network-based methods have been proposed; however, they cannot effectively locate and exploit the relation trigger features. To locate these features and make full use of them, we propose a novel multi-gram convolutional neural network-based self-attention model within a recurrent neural network framework. The multi-gram convolutional neural network attention model learns the adaptive relational semantics of the input, building on the observation that a relation can be fully characterized by the shortest dependency path between its two entities. With the learned relational semantics, we obtain the corresponding importance distribution over input sentences and thus locate the relation trigger features. For effective information propagation and integration, we use a bidirectional gated recurrent unit to encode the high-level features during recurrent propagation. Experimental results on two benchmark datasets demonstrate that the proposed model outperforms most state-of-the-art models.
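
The abstract describes the architecture only at a high level. As a rough illustration of the moving parts (multi-gram convolutions producing per-token attention weights that pool bidirectional GRU states), here is a minimal PyTorch sketch. Everything in it is an assumption for illustration: the class name, hyperparameters, and the 19-way output (SemEval-2010 Task 8 style) are not taken from the paper, and the sketch scores attention directly from the n-gram features rather than matching them against a shortest-dependency-path representation as the authors describe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiGramCNNSelfAttention(nn.Module):
    """Illustrative sketch: multi-gram CNN attention over a BiGRU encoder.

    All hyperparameters below are assumptions, not the paper's settings.
    """

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100,
                 kernel_sizes=(1, 2, 3), num_classes=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One convolution per n-gram width; each yields hidden_dim features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, hidden_dim, k, padding=k // 2)
             for k in kernel_sizes]
        )
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # Project concatenated multi-gram features to a per-token score.
        self.score = nn.Linear(hidden_dim * len(kernel_sizes), 1)
        self.classifier = nn.Linear(hidden_dim * 2, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)            # (B, T, E)
        conv_in = x.transpose(1, 2)              # (B, E, T)
        # Multi-gram CNN: each kernel width captures a different n-gram
        # context; trim to T tokens (even widths pad to T+1).
        grams = [F.relu(conv(conv_in))[..., :x.size(1)]
                 for conv in self.convs]
        gram_feats = torch.cat(grams, dim=1).transpose(1, 2)  # (B, T, K*H)
        # Importance distribution over the input tokens.
        attn = torch.softmax(self.score(gram_feats).squeeze(-1), dim=1)
        h, _ = self.gru(x)                       # (B, T, 2H)
        # Attention-weighted pooling of the BiGRU states.
        pooled = torch.bmm(attn.unsqueeze(1), h).squeeze(1)   # (B, 2H)
        return self.classifier(pooled)

# Usage: logits for a batch of 4 sentences of 30 token ids each.
model = MultiGramCNNSelfAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (4, 30)))  # (4, 19)
```

The parallel convolutions with widths 1, 2, and 3 are one simple way to realize the "multi-gram" idea: each width summarizes a different n-gram neighborhood of a token before the per-token scores are combined into a single attention distribution.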

Citation (APA)

Zhang, C., Cui, C., Gao, S., Nie, X., Xu, W., Yang, L., … Yin, Y. (2019). Multi-Gram CNN-Based Self-Attention Model for Relation Classification. IEEE Access, 7, 5343–5357. https://doi.org/10.1109/ACCESS.2018.2888508
