Weakly supervised methods estimate the labels for a dataset using the predictions of several noisy supervision sources. Many machine learning practitioners have begun using weak supervision to annotate data more quickly and cheaply than traditional manual labeling allows. In this paper, we focus on the specific problem of weakly supervised named entity recognition (NER) and propose an end-to-end model that learns optimal assignments of latent NER tags from observed tokens and weak labels provided by labeling functions. To capture the sequential dependencies between the latent and observed variables, we propose a sequential graphical model whose components are approximated by neural networks. State-of-the-art contextual embeddings are used to further discriminate the quality of noisy weak labels in various contexts. Experiments on four public weakly supervised named entity recognition datasets show a significant improvement in F1 score over recent approaches.
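To make the weak-supervision setup concrete, the following is a minimal, hypothetical sketch of how labeling functions produce noisy token-level NER labels that can then be aggregated. The labeling functions and the majority-vote aggregation here are illustrative assumptions only; the paper itself learns latent tags with a neural sequential graphical model rather than simple voting.

```python
# Hypothetical sketch of weak supervision for NER: several noisy labeling
# functions vote on each token's tag, and votes are combined by majority vote.
# This is NOT the paper's model, only an illustration of the weak-label input.
from collections import Counter

ABSTAIN = None  # a labeling function may decline to vote on a token

def lf_capitalized(token):
    # Noisy heuristic: capitalized tokens might be person names.
    return "PER" if token[0].isupper() else ABSTAIN

def lf_org_suffix(token):
    # Noisy heuristic: tokens ending in a company suffix might be organizations.
    return "ORG" if token.endswith(("Corp", "Inc")) else ABSTAIN

def lf_lowercase_outside(token):
    # Noisy heuristic: lowercase tokens are likely outside any entity.
    return "O" if token.islower() else ABSTAIN

LABELING_FUNCTIONS = [lf_capitalized, lf_org_suffix, lf_lowercase_outside]

def weak_labels(tokens):
    """Aggregate noisy labeling-function votes per token by majority vote;
    tokens with no votes default to the outside tag 'O'."""
    labels = []
    for tok in tokens:
        votes = [lf(tok) for lf in LABELING_FUNCTIONS]
        votes = [v for v in votes if v is not ABSTAIN]
        labels.append(Counter(votes).most_common(1)[0][0] if votes else "O")
    return labels

print(weak_labels(["Alice", "works", "in", "berlin"]))
```

A learned model, like the one proposed in the paper, improves on this kind of vote by weighting each labeling function's reliability in context and by modeling sequential dependencies between adjacent tags.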
Citation
Parker, J., & Yu, S. (2021). Named Entity Recognition through Deep Representation Learning and Weak Supervision. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 3828–3839). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.335