Active testing: An unbiased evaluation method for distantly supervised relation extraction


Abstract

Distant supervision has been widely used for neural relation extraction because it can label datasets automatically. However, existing work on distantly supervised relation extraction suffers from the low quality of the test set, which leads to considerably biased performance evaluation. These biases not only result in unfair evaluations but also mislead the optimization of neural relation extractors. To mitigate this problem, we propose a novel evaluation method named active testing, which utilizes both the noisy test set and a small number of manual annotations. Experiments on a widely used benchmark show that our proposed approach yields approximately unbiased evaluations for distantly supervised relation extractors.
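The core intuition is that a cheap but biased estimate computed on the noisy test set can be corrected with a small, manually annotated sample. The sketch below is only a rough illustration of that idea, assuming a simple difference (bias-correction) estimator rather than the paper's actual active testing procedure; all names (corrected_precision, noisy, manual, and the example data) are hypothetical.

    # Minimal sketch: correct a precision estimate computed on noisy test labels
    # using a few manually annotated instances (difference estimator).
    # This is an illustrative assumption, not the authors' exact method.
    from typing import Dict, List

    def _precision(pred: List[int], labels, indices: List[int]) -> float:
        """Precision of positive predictions restricted to `indices`,
        measured against `labels` (a list or dict of 0/1 labels)."""
        pos = [i for i in indices if pred[i] == 1]
        if not pos:
            return 0.0
        return sum(labels[i] == 1 for i in pos) / len(pos)

    def corrected_precision(pred: List[int],
                            noisy: List[int],
                            manual: Dict[int, int]) -> float:
        """Combine the noisy test set with a few manual annotations:
        start from precision against the noisy labels, then shift it by
        the bias observed on the manually annotated subsample."""
        all_idx = list(range(len(pred)))
        sample_idx = list(manual.keys())      # the few manually checked instances
        p_noisy_full = _precision(pred, noisy, all_idx)
        p_noisy_sample = _precision(pred, noisy, sample_idx)
        p_gold_sample = _precision(pred, manual, sample_idx)
        return p_noisy_full + (p_gold_sample - p_noisy_sample)

    # Toy usage: 8 test instances, a 3-instance manually annotated sample.
    pred = [1, 1, 1, 0, 1, 0, 1, 1]
    noisy = [1, 0, 1, 0, 1, 0, 0, 1]
    manual = {1: 0, 2: 0, 6: 1}
    print(round(corrected_precision(pred, noisy, manual), 3))

In this toy setup the correction term is the gap between gold and noisy precision on the annotated sample, which is what pulls the full-set estimate toward an unbiased value as the sample grows.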

Citation (APA)

Li, P., Zhang, X., Jia, W., & Zhao, W. (2020). Active testing: An unbiased evaluation method for distantly supervised relation extraction. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 204–211). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.20
