Empower Distantly Supervised Relation Extraction with Collaborative Adversarial Training

17 Citations · 26 Readers (Mendeley)

Abstract

With recent advances in distantly supervised (DS) relation extraction (RE), considerable attention has been devoted to leveraging multi-instance learning (MIL) to distill high-quality supervision from noisy DS data. Here, we go beyond label noise and identify the key bottleneck of DS-MIL as its low data utilization: while refining high-quality supervision, MIL abandons a large number of training instances, leaving the model with far less supervision than the data could provide. In this paper, we propose collaborative adversarial training to improve data utilization by coordinating virtual adversarial training (VAT) and adversarial training (AT) at different levels. Specifically, since VAT is label-free, we apply instance-level VAT to recycle the instances abandoned by MIL. In addition, we deploy bag-level AT to fully exploit the high-quality supervision obtained by MIL. Our method brings consistent improvements (about 5 absolute AUC points) over the previous state of the art, which verifies both the importance of the data utilization issue and the effectiveness of our approach.
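To make the idea concrete, the sketch below shows one way the two adversarial objectives described in the abstract could be combined in a single training step, written in PyTorch. It is a minimal illustration, not the authors' implementation: the toy linear encoder, the use of mean pooling in place of the paper's MIL bag selector, the FGM-style bag-level perturbation, the one-step VAT power iteration, and all hyper-parameters (eps, xi, the loss weight) are assumptions made here for readability.

```python
# Minimal sketch (not the authors' code) of coordinating bag-level adversarial
# training (AT) with instance-level virtual adversarial training (VAT).
import torch
import torch.nn.functional as F

torch.manual_seed(0)

NUM_REL, DIM = 5, 32                      # hypothetical relation count / feature size
encoder = torch.nn.Sequential(torch.nn.Linear(DIM, 64), torch.nn.ReLU())
classifier = torch.nn.Linear(64, NUM_REL)
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)


def bag_at_loss(bag_feats, label, eps=1e-2):
    """Bag-level AT: perturb the bag representation along the gradient of the
    supervised loss (FGM-style) and train on the clean + adversarial views."""
    bag_repr = encoder(bag_feats).mean(dim=0, keepdim=True)  # mean pooling stands in for the MIL selector
    clean_loss = F.cross_entropy(classifier(bag_repr), label)
    grad = torch.autograd.grad(clean_loss, bag_repr, retain_graph=True)[0]
    adv_repr = bag_repr + eps * grad / (grad.norm() + 1e-12)
    adv_loss = F.cross_entropy(classifier(adv_repr), label)
    return clean_loss + adv_loss


def instance_vat_loss(inst_feats, xi=1e-6, eps=1e-2, n_power=1):
    """Instance-level VAT: a label-free smoothness regularizer applied to
    instances that the MIL selector would otherwise discard."""
    with torch.no_grad():
        clean_probs = F.softmax(classifier(encoder(inst_feats)), dim=-1)
    d = torch.randn_like(inst_feats)
    for _ in range(n_power):                                  # power iteration for the adversarial direction
        d = (xi * F.normalize(d, dim=-1)).requires_grad_(True)
        adv_logits = classifier(encoder(inst_feats + d))
        kl = F.kl_div(F.log_softmax(adv_logits, dim=-1), clean_probs,
                      reduction="batchmean")
        d = torch.autograd.grad(kl, d)[0]
    r_adv = eps * F.normalize(d, dim=-1)
    adv_logits = classifier(encoder(inst_feats + r_adv))
    return F.kl_div(F.log_softmax(adv_logits, dim=-1), clean_probs,
                    reduction="batchmean")


# One optimisation step on a toy bag: the MIL-selected instances get bag-level
# AT, while the abandoned instances only contribute the label-free VAT term.
selected = torch.randn(4, DIM)            # instances kept by the (hypothetical) MIL selector
abandoned = torch.randn(6, DIM)           # instances MIL would discard
label = torch.tensor([2])                 # DS bag label

loss = bag_at_loss(selected, label) + 1.0 * instance_vat_loss(abandoned)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"combined loss: {loss.item():.4f}")
```

The point the sketch captures is the division of labor: the DS bag label is consumed only by the bag-level AT term, while the instances abandoned by MIL contribute a purely label-free consistency term, so no training sentence is wasted.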

Cite (APA)

Chen, T., Shi, H., Liu, L., Tang, S., Shao, J., Chen, Z., & Zhuang, Y. (2021). Empower Distantly Supervised Relation Extraction with Collaborative Adversarial Training. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 14A, pp. 12675–12682). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i14.17501
