Pre-training entity relation encoder with intra-span and inter-span information

43 citations · 103 Mendeley readers

Abstract

In this paper, we integrate span-related information into a pre-trained encoder for the entity relation extraction task. Instead of using a general-purpose sentence encoder (e.g., an existing universal pre-trained model), we introduce a span encoder and a span pair encoder into the pre-training network, which makes it easier to import intra-span and inter-span information into the pre-trained model. To learn the encoders, we devise three customized pre-training objectives from different perspectives, targeting tokens, spans, and span pairs respectively. In particular, the span encoder is trained to recover a random shuffling of the tokens in a span, and the span pair encoder is trained with a contrastive loss to distinguish positive pairs, whose spans come from the same sentence, from negative pairs, whose spans come from different sentences. Experimental results show that the proposed pre-training method outperforms distantly supervised pre-training and achieves promising performance on two entity relation extraction benchmark datasets (ACE05 and SciERC).
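To make the two span-level objectives concrete, here is a minimal PyTorch sketch of how they could be implemented. This is an illustration under assumptions, not the authors' code: the names SpanShuffleHead and span_pair_contrastive_loss are hypothetical, the shuffle objective is cast here as per-token position classification, and the margin-based contrastive form is one common choice; the paper's actual loss and architecture may differ.

```python
# Minimal sketch of the two span-level pre-training objectives.
# All names (SpanShuffleHead, span_pair_contrastive_loss) are hypothetical,
# and the margin-based contrastive form is one common formulation; the
# paper's actual objectives may differ in detail.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpanShuffleHead(nn.Module):
    """Token-order recovery: given encoder states for a span whose tokens
    were randomly shuffled, predict each token's original position."""

    def __init__(self, hidden_size: int, max_span_len: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, max_span_len)

    def forward(self, span_states: torch.Tensor,
                gold_positions: torch.Tensor) -> torch.Tensor:
        # span_states: (batch, span_len, hidden)
        # gold_positions: (batch, span_len), index of each token before shuffling
        logits = self.classifier(span_states)  # (batch, span_len, max_span_len)
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               gold_positions.reshape(-1))


def span_pair_contrastive_loss(span_a: torch.Tensor,
                               span_b: torch.Tensor,
                               same_sentence: torch.Tensor,
                               margin: float = 1.0) -> torch.Tensor:
    """Margin-based contrastive loss: pull together span representations
    drawn from the same sentence, push apart spans from different ones."""
    # span_a, span_b: (batch, hidden); same_sentence: (batch,), 1.0 or 0.0
    dist = F.pairwise_distance(span_a, span_b)
    positive = same_sentence * dist.pow(2)
    negative = (1.0 - same_sentence) * F.relu(margin - dist).pow(2)
    return (positive + negative).mean()


# Toy usage with random tensors standing in for span-encoder outputs.
hidden, span_len = 768, 5
head = SpanShuffleHead(hidden, max_span_len=10)
states = torch.randn(4, span_len, hidden)
gold = torch.stack([torch.randperm(span_len) for _ in range(4)])
shuffle_loss = head(states, gold)

a, b = torch.randn(4, hidden), torch.randn(4, hidden)
labels = torch.tensor([1.0, 0.0, 1.0, 0.0])
pair_loss = span_pair_contrastive_loss(a, b, labels)
```

In this reading, the shuffle-recovery head forces span representations to encode token order within a span (intra-span information), while the contrastive pair loss rewards representations that reflect whether two spans co-occur in the same sentence (inter-span information).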

Citation (APA)

Wang, Y., Sun, C., Wu, Y., Yan, J., Gao, P., & Xie, G. (2020). Pre-training entity relation encoder with intra-span and inter-span information. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 1692–1705). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.emnlp-main.132
