Global bootstrapping neural network for entity set expansion

Abstract

Bootstrapping for entity set expansion (ESE), which expands new entities using only a few seed entities as supervision, has been studied for a long time. Recent end-to-end bootstrapping approaches have shown their advantages in capturing information and modeling the bootstrapping process. However, due to the sparse supervision problem, previous end-to-end methods often leverage only information from near neighborhoods (local semantics) rather than information propagated through the co-occurrence structure of the whole corpus (global semantics). To address this issue, this paper proposes the Global Bootstrapping Network (GBN) with “pre-training and fine-tuning” strategies for effective learning. Specifically, it contains a global-sighted encoder that captures and encodes both local and global semantics into entity embeddings, and an attention-guided decoder that sequentially expands new entities based on these embeddings. Experimental results show that GBN trained with the “pre-training and fine-tuning” strategies achieves state-of-the-art performance on two bootstrapping datasets.
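To make the encoder–decoder description concrete, below is a minimal, illustrative PyTorch sketch. It is not the authors' implementation: it assumes the “global-sighted” encoder mixes local entity features with features propagated over a normalised co-occurrence graph, and that the decoder scores candidate entities by attending to the already-expanded set. All class names, tensor shapes, and the toy data are assumptions made for illustration only.

```python
# Minimal sketch (not the paper's code) of an encoder-decoder for entity
# set expansion: a graph-style encoder over a co-occurrence matrix plus an
# attention-based decoder that picks one new entity per step.
import torch
import torch.nn as nn


class GlobalSightedEncoder(nn.Module):
    """Combines local entity features with features propagated over the
    (row-normalised) co-occurrence graph -- one plausible way to inject
    'global semantics'; the exact formulation in the paper may differ."""

    def __init__(self, dim):
        super().__init__()
        self.local = nn.Linear(dim, dim)
        self.global_ = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x:   (num_entities, dim) local entity features
        # adj: (num_entities, num_entities) normalised co-occurrence graph
        return torch.relu(self.local(x) + self.global_(adj @ x))


class AttentionGuidedDecoder(nn.Module):
    """Scores candidate entities by attending to the current expanded set."""

    def __init__(self, dim):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)

    def forward(self, candidates, expanded):
        # candidates: (num_candidates, dim), expanded: (set_size, dim)
        ctx, _ = self.attn(candidates.unsqueeze(0),
                           expanded.unsqueeze(0),
                           expanded.unsqueeze(0))
        # Higher score = candidate is closer to the attended seed context.
        return (ctx.squeeze(0) * candidates).sum(-1)


# Toy usage: expand one entity beyond a two-entity seed set.
dim, n = 16, 10
feats = torch.randn(n, dim)                       # hypothetical entity features
adj = torch.softmax(torch.randn(n, n), dim=-1)    # hypothetical co-occurrence graph
enc, dec = GlobalSightedEncoder(dim), AttentionGuidedDecoder(dim)
seeds = [0, 1]

with torch.no_grad():
    emb = enc(feats, adj)
    scores = dec(emb, emb[seeds])
    scores[seeds] = float("-inf")                 # never re-select seed entities
    next_entity = scores.argmax().item()
```

In a full system this selection step would run iteratively, with the newly chosen entity appended to the expanded set before the next decoding step, and the encoder would first be pre-trained on the corpus and then fine-tuned with the seed supervision, as described in the abstract.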

Cite

APA

Yan, L., Han, X., He, B., & Sun, L. (2020). Global bootstrapping neural network for entity set expansion. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 3705–3714). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.331
