Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors

Citations: 13 · Mendeley readers: 77

Abstract

We propose a multi-task, probabilistic approach to facilitate distantly supervised relation extraction by bringing closer the representations of sentences that contain the same Knowledge Base pairs. To achieve this, we bias the latent space of sentences via a Variational Autoencoder (VAE) that is trained jointly with a relation classifier. The latent code guides the pair representations and influences sentence reconstruction. Experimental results on two datasets created via distant supervision indicate that multi-task learning yields performance benefits. Further exploration of incorporating Knowledge Base priors into the VAE reveals that the sentence space can be shifted towards that of the Knowledge Base, offering interpretability and further improving results.
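The abstract describes a joint objective: a relation-classification loss combined with VAE terms, where the KL regularizer pulls the sentence latent space towards a Knowledge Base prior. The sketch below is a hypothetical NumPy illustration of such an objective, not the paper's exact formulation; the weights `beta` and `lam` and all function names are assumptions for illustration.

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians,
    summed over latent dimensions. With a KB-derived prior (mu_p, logvar_p),
    minimizing this term shifts the sentence latent space towards the KB."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

def joint_loss(ce_loss, recon_nll, mu_q, logvar_q, mu_kb, logvar_kb,
               beta=1.0, lam=1.0):
    """Multi-task objective sketch: relation-classification cross-entropy
    plus VAE terms (reconstruction NLL and a KL to the KB prior).
    beta and lam are hypothetical trade-off weights."""
    kl = kl_diag_gaussians(mu_q, logvar_q, mu_kb, logvar_kb)
    return ce_loss + lam * (recon_nll + beta * kl)
```

When the approximate posterior matches the KB prior exactly, the KL term vanishes and the objective reduces to classification plus reconstruction, which is the sense in which the prior "biases" rather than constrains the latent space.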

Citation (APA)
Christopoulou, F., Miwa, M., & Ananiadou, S. (2021). Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 11–26). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.2
