Distantly supervised neural network model for relation extraction

Abstract

For the task of relation extraction, distant supervision is an efficient approach to generating labeled data by aligning a knowledge base (KB) with free text. Although this procedure easily scales to thousands of different relations, it suffers from introducing wrong labels, because a relation in the knowledge base may not actually be expressed by the aligned sentences (mentions). In this paper, we propose a novel approach that alleviates this problem of distant supervision through representation learning in a deep neural network framework. Our model, the Distantly Supervised Neural Network (DSNN), constructs a more powerful mention-level representation via a tensor-based transformation and further learns an entity-pair-level representation that aggregates and denoises the features of the associated mentions. With this denoised representation, all relation labels can be learned jointly. Experimental results show that, with minimal feature engineering, our model generally outperforms state-of-the-art methods for distantly supervised relation extraction.
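The abstract's two-stage architecture — a tensor-based transformation producing mention-level representations, followed by an aggregation step that denoises across the mentions of an entity pair — can be sketched roughly as follows. This is an illustrative NumPy sketch under assumed shapes and a max-pooling aggregator, not the paper's actual DSNN implementation; the function names, dimensions, and nonlinearity are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mention_representation(x, T, W, b):
    """Tensor-based transformation of one mention's feature vector x (d,).

    T: (k, d, d) stack of bilinear tensor slices (hypothetical shape),
    W: (k, d) linear weights, b: (k,) bias.
    Each tensor slice contributes one scalar x^T T[i] x, giving a
    k-dimensional mention-level representation after a nonlinearity.
    """
    bilinear = np.einsum("i,kij,j->k", x, T, x)  # one scalar per slice
    return np.tanh(bilinear + W @ x + b)

def entity_pair_representation(mention_reprs):
    """Aggregate the mention representations of one entity pair.

    Element-wise max pooling is used here as a simple stand-in for the
    paper's denoising aggregation: it lets the strongest evidence across
    mentions dominate, down-weighting wrongly labeled mentions.
    """
    return np.max(np.stack(mention_reprs), axis=0)

# Toy example: three mentions of one entity pair, d = 8 features each.
d, k = 8, 4
T = rng.normal(size=(k, d, d))
W = rng.normal(size=(k, d))
b = np.zeros(k)
mentions = [mention_representation(rng.normal(size=d), T, W, b)
            for _ in range(3)]
pair_repr = entity_pair_representation(mentions)
print(pair_repr.shape)  # (4,)
```

The resulting entity-pair-level vector would then feed a classifier that predicts all relation labels jointly, as the abstract describes.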

Citation (APA)

Wang, Z., Chang, B., & Sui, Z. (2015). Distantly supervised neural network model for relation extraction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9427, pp. 253–266). Springer Verlag. https://doi.org/10.1007/978-3-319-25816-4_21
