Improving Distantly Supervised Relation Extraction with Self-Ensemble Noise Filtering

Abstract

Distantly supervised models are very popular for relation extraction because the distant supervision method yields a large amount of training data without human annotation. In distant supervision, a sentence is considered a source of a tuple if it contains both entities of the tuple. However, this condition is too permissive and does not guarantee that the sentence carries relevant relation-specific information. As a result, distantly supervised training data contains considerable noise, which adversely affects model performance. In this paper, we propose a self-ensemble filtering mechanism to filter out noisy samples during the training process. We evaluate our proposed framework on the New York Times dataset, which was obtained via distant supervision. Our experiments with multiple state-of-the-art neural relation extraction models show that our proposed filtering mechanism improves the robustness of the models and increases their F1 scores.
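The abstract above sketches the idea of filtering noisy distantly supervised samples with a self-ensemble of the model's own predictions. A minimal illustration of one common variant of this idea follows; the class name, the EMA decay `alpha`, and the confidence `threshold` are hypothetical choices for this sketch, not the authors' exact method.

```python
import numpy as np

class SelfEnsembleFilter:
    """Illustrative self-ensemble noise filter for distant supervision.

    Keeps an exponential moving average (EMA) of the model's per-sample
    class probabilities across training epochs; a sample is filtered out
    when the ensembled prediction no longer supports its distant label.
    (All names and hyperparameters here are assumptions for the sketch.)
    """

    def __init__(self, n_samples, n_classes, alpha=0.9, threshold=0.2):
        self.alpha = alpha          # EMA decay for the prediction ensemble
        self.threshold = threshold  # min ensemble confidence in the distant label
        # Start from a uniform distribution over relation classes.
        self.ema = np.full((n_samples, n_classes), 1.0 / n_classes)

    def update(self, sample_ids, probs):
        # Blend the current epoch's softmax outputs into the running ensemble.
        self.ema[sample_ids] = (self.alpha * self.ema[sample_ids]
                                + (1.0 - self.alpha) * probs)

    def keep_mask(self, sample_ids, labels):
        # Keep a sample only if the ensembled prediction still assigns
        # enough probability to its distant-supervision label.
        conf = self.ema[sample_ids, labels]
        return conf >= self.threshold
```

In use, `update` would be called each epoch with the model's softmax outputs, and `keep_mask` would select which distantly labeled samples feed the next epoch's loss; samples whose labels the ensemble consistently rejects are dropped as noise.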

Citation (APA)

Nayak, T., Majumder, N., & Poria, S. (2021). Improving Distantly Supervised Relation Extraction with Self-Ensemble Noise Filtering. In International Conference Recent Advances in Natural Language Processing, RANLP (pp. 1031–1039). Incoma Ltd. https://doi.org/10.26615/978-954-452-072-4_116
