PARE: A Simple and Strong Baseline for Monolingual and Multilingual Distantly Supervised Relation Extraction

Abstract

Neural models for distantly supervised relation extraction (DS-RE) encode each sentence in an entity-pair bag separately, and the sentence representations are then aggregated for bag-level relation prediction. Because these approaches do not allow information to flow between sentences in the bag at encoding time, we believe they do not fully exploit the available bag data. In response, we explore a simple baseline (PARE) in which all sentences of a bag are concatenated into a single passage and encoded jointly using BERT. The contextual token embeddings are aggregated using attention with the candidate relation as the query, and this passage-level summary is used to predict the candidate relation. We find that this simple baseline outperforms existing state-of-the-art DS-RE models on both monolingual and multilingual DS-RE datasets.
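The aggregation described above (joint passage encoding followed by relation-as-query attention over token embeddings) can be sketched in PyTorch as follows. This is an illustrative reconstruction, not the authors' released code: the model name, hidden size, number of relations, and the relation-specific scoring head are assumptions made for the example.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class PassageAttentionRE(nn.Module):
    """Sketch of passage-level encoding with relation-as-query attention.

    Assumptions: bert-base-uncased as the encoder, 53 candidate relations,
    and a per-relation linear scoring head; these are illustrative choices.
    """

    def __init__(self, model_name="bert-base-uncased", num_relations=53):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # One learnable query vector per candidate relation.
        self.relation_queries = nn.Embedding(num_relations, hidden)
        # Per-relation scoring weights (used via its weight matrix below).
        self.classifier = nn.Linear(hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        # Jointly encode the concatenated bag as one "passage".
        tokens = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                                        # (B, T, H)
        queries = self.relation_queries.weight                      # (R, H)
        # Attention scores of every relation query against every token.
        scores = torch.einsum("rh,bth->brt", queries, tokens)       # (B, R, T)
        scores = scores.masked_fill(attention_mask.unsqueeze(1) == 0, -1e9)
        weights = scores.softmax(dim=-1)
        # Relation-specific summaries of the whole passage.
        summaries = torch.einsum("brt,bth->brh", weights, tokens)   # (B, R, H)
        # Score each candidate relation with its own passage summary.
        logits = torch.einsum("brh,rh->br", summaries, self.classifier.weight)
        return logits + self.classifier.bias
```

In use, the sentences of a bag would be concatenated (e.g. joined with separator tokens) before tokenization, and the logits would typically feed a multi-label objective such as binary cross-entropy, since a bag may express several relations; the exact training setup here is an assumption rather than the paper's specification.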

Citation (APA)

Rathore, V., Badola, K., Singla, P., & Mausam. (2022). PARE: A Simple and Strong Baseline for Monolingual and Multilingual Distantly Supervised Relation Extraction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 340–354). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-short.38
