Hierarchical attention CNN and entity-aware for relation extraction

Abstract

Convolutional neural networks (CNNs) are widely used in the relation extraction (RE) task. Previous work simply uses max pooling to select features, which neither preserves position information nor handles long sentences well. In addition, the information critical for relation classification tends to appear within a particular segment of the sentence, so a method that extracts features at the segment level is needed. In this paper, we propose a novel model with hierarchical attention that captures both local syntactic features and global structural features. A position-aware attention pooling is designed to weigh the importance of convolution features and capture fine-grained information, and a segment-level self-attention is used to identify the most important segment in the sentence. We also use entity-mask and entity-aware techniques so that the model focuses on different aspects of the information at different stages. Experiments show that the proposed method accurately captures the key information in sentences and greatly improves relation classification performance compared to state-of-the-art methods.
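The abstract does not give the exact formulation of the position-aware attention pooling, but the idea of replacing max pooling with an attention-weighted sum over convolution outputs can be sketched as follows. This is a minimal NumPy illustration under assumed shapes; the function name, the concatenation of position embeddings into the scoring input, and the single learned weight vector `w` are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_aware_attention_pooling(conv_feats, pos_emb, w):
    """Pool convolution features with position-aware attention weights.

    conv_feats: (seq_len, d)  convolution outputs, one vector per position
    pos_emb:    (seq_len, p)  position embeddings (e.g., distance to entities)
    w:          (d + p,)      learned scoring vector (hypothetical form)
    Returns a (d,) sentence vector: a weighted sum instead of max pooling,
    so the relative importance of each position is retained.
    """
    scores = np.concatenate([conv_feats, pos_emb], axis=1) @ w  # (seq_len,)
    alpha = softmax(scores)                                     # attention weights
    return alpha @ conv_feats                                   # (d,) pooled vector
```

Unlike max pooling, every position contributes to the pooled vector in proportion to its attention weight, which is what lets the model keep positional information in long sentences.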

Citation (APA)

Zhu, X., Liu, G., & Su, B. (2019). Hierarchical attention CNN and entity-aware for relation extraction. In Communications in Computer and Information Science (Vol. 1142 CCIS, pp. 87–94). Springer. https://doi.org/10.1007/978-3-030-36808-1_10
