Denoising is an essential step in distant-supervision-based named entity recognition. Previous denoising methods rely mostly on instance-level confidence statistics, which ignore how the underlying noise distribution varies across datasets and entity types. This makes them difficult to adapt to high-noise-rate settings. In this paper, we propose Hypergeometric Learning (HGL), a denoising algorithm for distantly supervised NER that takes both the noise distribution and instance-level confidence into consideration. Specifically, during neural network training, we model the noisy samples in each batch as following a hypergeometric distribution parameterized by the noise rate. Each instance in the batch is then regarded as either correct or noisy according to its label confidence derived from the previous training step, together with the noise distribution in the sampled batch. Experiments show that HGL can effectively denoise the weakly labeled data retrieved from distant supervision, and therefore yields significant improvements in the trained models.
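To make the batch-level procedure concrete, below is a minimal sketch (Python/NumPy) of the denoising step the abstract describes: the number of noisy instances in a sampled batch is drawn from a hypergeometric distribution parameterized by the corpus-level noise rate, and the least-confident instances, ranked by label confidence from the previous training step, are treated as noisy. All function and variable names are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def hypergeometric_denoise(batch_confidences, population_size, noise_rate, rng=None):
    """Flag likely-noisy instances in a batch.

    Sketch only: a batch drawn without replacement from a corpus with a
    known overall noise rate contains a hypergeometrically distributed
    number of noisy instances; the lowest-confidence instances (using
    confidence from the previous training step) are treated as noisy.
    """
    rng = rng or np.random.default_rng()
    batch_size = len(batch_confidences)
    num_noisy_in_corpus = int(round(noise_rate * population_size))

    # Draw how many of this batch's instances are noisy under
    # Hypergeom(N = corpus size, K = noisy in corpus, n = batch size).
    num_noisy_in_batch = rng.hypergeometric(
        ngood=num_noisy_in_corpus,
        nbad=population_size - num_noisy_in_corpus,
        nsample=batch_size,
    )

    # Mark the `num_noisy_in_batch` least-confident instances as noisy.
    order = np.argsort(batch_confidences)  # ascending confidence
    is_noisy = np.zeros(batch_size, dtype=bool)
    is_noisy[order[:num_noisy_in_batch]] = True
    return is_noisy


# Example: a batch of 8 instances from a 10,000-sentence corpus with an
# assumed 30% noise rate; True entries would be down-weighted or ignored
# in the training loss.
conf = np.array([0.95, 0.12, 0.88, 0.40, 0.77, 0.05, 0.91, 0.63])
print(hypergeometric_denoise(conf, population_size=10_000, noise_rate=0.3))
```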
CITATION
Zhang, W., Lin, H., Han, X., Sun, L., Liu, H., Wei, Z., & Yuan, N. J. (2021). Denoising Distantly Supervised Named Entity Recognition via a Hypergeometric Probabilistic Model. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 16, pp. 14481–14488). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i16.17702