Improving distantly-supervised relation extraction with joint label embedding

51 citations · 156 readers (Mendeley)

Abstract

Distantly-supervised relation extraction has proven effective at finding relational facts in text. However, existing approaches treat labels as independent, meaningless one-hot vectors, which causes a loss of potential label information when selecting valid instances. In this paper, we propose a novel multi-layer attention-based model that improves relation extraction with joint label embedding. The model makes full use of both structural information from knowledge graphs and textual information from entity descriptions to learn label embeddings through gated integration, while suppressing the introduced noise with an attention mechanism. The learned label embeddings then serve as an additional attention over the instances (whose embeddings are also enriched with entity descriptions) to improve relation extraction. Extensive experiments demonstrate that our model significantly outperforms state-of-the-art methods.
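The paper gives the precise formulation; purely as an illustration of the two ideas the abstract names, the sketch below shows a gated fusion of a structural and a textual label vector, followed by label-guided attention pooling over instance embeddings. All names, shapes, and the sigmoid gate parameterization are assumptions for this sketch, not the authors' exact model.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_label_embedding(e_struct, e_text, W_g, b_g):
    """Fuse a structural (KG) and a textual (description) label vector.

    A sigmoid gate decides, per dimension, how much of each source to keep.
    W_g and b_g are hypothetical gate parameters, shape (2d, d) and (d,).
    """
    g = 1.0 / (1.0 + np.exp(-(np.concatenate([e_struct, e_text]) @ W_g + b_g)))
    return g * e_struct + (1.0 - g) * e_text

def label_attention(instances, label_emb):
    """Pool a bag of instance embeddings, weighted by similarity to the label.

    instances: (n, d) instance embeddings; label_emb: (d,) fused label vector.
    Returns a (d,) bag representation emphasizing label-relevant instances.
    """
    scores = softmax(instances @ label_emb)
    return scores @ instances
```

The gate lets the model lean on the knowledge-graph view when an entity description is uninformative (and vice versa), while the label-conditioned attention down-weights noisy instances in the distantly-supervised bag.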

Citation (APA)

Hu, L., Zhang, L., Shi, C., Nie, L., Guan, W., & Yang, C. (2019). Improving distantly-supervised relation extraction with joint label embedding. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 3821–3829). Association for Computational Linguistics. https://doi.org/10.18653/v1/d19-1395
