Rethinking Negative Sampling for Handling Missing Entity Annotations


Abstract

Negative sampling is highly effective in handling missing annotations for named entity recognition (NER). One of our contributions is an analysis of why it works, built on two insightful concepts: missampling and uncertainty. Empirical studies show that a low missampling rate and high uncertainty are both essential for achieving strong performance with negative sampling. Based on the sparsity of named entities, we also theoretically derive a lower bound on the probability of a zero missampling rate, which depends only on sentence length. The other contribution is an adaptive and weighted sampling distribution that further improves negative sampling, guided by the preceding analysis. Experiments on synthetic datasets and well-annotated datasets (e.g., CoNLL-2003) show that our proposed approach benefits negative sampling in terms of F1 score and loss convergence. Moreover, models trained with the improved negative sampling achieve new state-of-the-art results on real-world datasets (e.g., EC).
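
The abstract does not spell out the sampling procedure itself. The following is a minimal Python sketch of span-level negative sampling for NER under missing annotations; the span window, sampling rate, and the optional uncertainty weighting are illustrative assumptions, not the authors' exact adaptive distribution, which is defined in the full paper.

```python
# Illustrative sketch of span-level negative sampling for NER with missing
# annotations. Unlabeled spans are sampled as negatives ("O"); sampling only
# a small subset keeps the expected missampling rate (accidentally treating
# an unannotated entity as a negative) low.
import math
import random


def enumerate_spans(n_tokens, max_len=10):
    """All candidate spans (i, j) with inclusive indices and length <= max_len."""
    return [(i, j) for i in range(n_tokens)
            for j in range(i, min(n_tokens, i + max_len))]


def sample_negatives(n_tokens, annotated_spans, uncertainty=None, rate=0.35):
    """Sample unlabeled spans to use as negatives during training.

    annotated_spans: set of (start, end) spans with gold entity labels.
    uncertainty: optional dict span -> uncertainty score (e.g., entropy of the
                 model's label distribution); higher-uncertainty spans are
                 drawn more often, mimicking a weighted sampling distribution.
    rate: negatives are sampled at roughly rate * sentence length
          (a common heuristic, assumed here for illustration).
    """
    candidates = [s for s in enumerate_spans(n_tokens)
                  if s not in annotated_spans]
    k = min(len(candidates), max(1, math.ceil(rate * n_tokens)))
    if uncertainty is None:
        # Uniform sampling without replacement.
        return random.sample(candidates, k)
    # Weighted sampling (with replacement, for simplicity).
    weights = [uncertainty.get(s, 1e-6) for s in candidates]
    return random.choices(candidates, weights=weights, k=k)


# Example: a 6-token sentence with one annotated entity span (2, 3).
negatives = sample_negatives(6, annotated_spans={(2, 3)})
print(negatives)
```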

Citation (APA)
Li, Y., Liu, L., & Shi, S. (2022). Rethinking Negative Sampling for Handling Missing Entity Annotations. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7188–7197). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.497
