Knowledge-aware and retrieval-based models for distantly supervised relation extraction

Abstract

Distantly supervised relation extraction (RE) is an effective way to discover novel relational facts from text without a large amount of well-labeled training data. However, distant supervision inevitably suffers from the wrong-labeling problem. Many neural approaches have recently been proposed to alleviate this problem, but none of them makes use of the rich semantic knowledge in knowledge bases (KBs). In this paper, we propose a knowledge-aware attention model that leverages the semantic knowledge in the KB to select valid sentences. Furthermore, based on knowledge representation learning (KRL), we formalize distantly supervised RE as relation retrieval rather than relation classification, exploiting the semantic knowledge further. Experimental results on widely used datasets show that our approaches significantly outperform popular benchmark methods.
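To illustrate the two ideas in the abstract, the sketch below shows (a) attention over a bag of sentence vectors guided by a KB-derived query, and (b) scoring candidate relation embeddings against the aggregated bag representation instead of applying a softmax classifier. This is a minimal illustration, not the authors' implementation: the choice of the query vector (e.g. a TransE-style relation embedding r ≈ t − h) and the dot-product scoring are assumptions made for the sketch.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def knowledge_aware_attention(sentence_reprs, query):
    """Weight a bag of sentence vectors by similarity to a KB-derived query.

    sentence_reprs: (n_sentences, dim) array of sentence encodings.
    query: (dim,) vector; assumed here to be a relation embedding from a
    KRL model (an illustrative assumption, not the paper's exact design).
    Returns the attention-weighted bag representation, shape (dim,).
    """
    scores = sentence_reprs @ query        # one relevance score per sentence
    alpha = softmax(scores)                # attention weights over the bag
    return alpha @ sentence_reprs          # weighted sum of sentence vectors

def retrieve_relation(bag_repr, relation_embs):
    """Relation *retrieval*: rank candidate relation embeddings by
    similarity to the bag representation, instead of classifying.

    relation_embs: (n_relations, dim) matrix of KRL relation embeddings.
    Returns the index of the best-matching relation and all scores.
    """
    scores = relation_embs @ bag_repr
    return int(np.argmax(scores)), scores
```

In this framing, a new relation can in principle be scored at test time simply by adding its embedding as a row of `relation_embs`, which is one motivation for retrieval over classification.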

Citation (APA):
Zhang, X., Deng, K., Zhang, L., Tan, Z., & Liu, J. (2019). Knowledge-aware and retrieval-based models for distantly supervised relation extraction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11670 LNAI, pp. 148–161). Springer Verlag. https://doi.org/10.1007/978-3-030-29908-8_12
