NAG-NER: a Unified Non-Autoregressive Generation Framework for Various NER Tasks

5 citations · 8 Mendeley readers

Abstract

Recently, the recognition of flat, nested, and discontinuous entities by a unified generative model framework has received increasing attention both in the research field and in industry. However, current generative NER methods force the entities to be generated in a predefined order, suffering from error propagation and inefficient decoding. In this work, we propose a unified non-autoregressive generation (NAG) framework for general NER tasks, referred to as NAG-NER. First, we propose to generate entities as a set instead of a sequence, avoiding error propagation. Second, we propose incorporating NAG in NER tasks for efficient decoding by treating each entity as a target sequence. Third, to enhance the generation performance of the NAG decoder, we employ the NAG encoder to detect potential entity mentions. Extensive experiments show that our NAG-NER model outperforms state-of-the-art generative NER models on three benchmark NER datasets of different types and on two of our proprietary NER tasks.
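The core idea in the abstract, decoding entities in parallel as an unordered set rather than generating them left to right, can be illustrated with a small sketch. The code below is a minimal, hypothetical PyTorch illustration, not the authors' released implementation: the class name NAGEntityDecoder, the learned-query design, and all shapes and prediction heads are assumptions made purely for illustration (the actual NAG-NER model treats each entity as a target sequence and uses its encoder to propose candidate mentions).

# Hypothetical sketch of non-autoregressive, set-style entity decoding.
# NOT the NAG-NER implementation; all names/shapes are illustrative assumptions.
import torch
import torch.nn as nn

class NAGEntityDecoder(nn.Module):
    """Decodes a fixed-size set of candidate entities in one parallel pass."""

    def __init__(self, hidden_size: int, num_queries: int, num_labels: int):
        super().__init__()
        # Learned "entity query" slots: each slot may yield one entity, so the
        # output is an unordered set rather than a left-to-right sequence.
        self.queries = nn.Parameter(torch.randn(num_queries, hidden_size))
        self.attn = nn.MultiheadAttention(hidden_size, num_heads=8, batch_first=True)
        self.start_proj = nn.Linear(hidden_size, hidden_size)  # span-start scorer
        self.end_proj = nn.Linear(hidden_size, hidden_size)    # span-end scorer
        self.label_head = nn.Linear(hidden_size, num_labels)   # type incl. a "none" label

    def forward(self, encoder_states: torch.Tensor):
        # encoder_states: (batch, seq_len, hidden) from any token encoder.
        batch = encoder_states.size(0)
        q = self.queries.unsqueeze(0).expand(batch, -1, -1)
        # All slots attend to the sentence simultaneously: unlike autoregressive
        # decoding, cost does not grow with the number of entities generated.
        slots, _ = self.attn(q, encoder_states, encoder_states)
        # Score every token as a potential span start/end for each slot.
        start_logits = torch.einsum("bqh,bsh->bqs", self.start_proj(slots), encoder_states)
        end_logits = torch.einsum("bqh,bsh->bqs", self.end_proj(slots), encoder_states)
        labels = self.label_head(slots)
        return start_logits, end_logits, labels

# Usage: one forward pass yields up to num_queries (start, end, label) triples;
# slots predicting the "none" label would be discarded at inference time.
decoder = NAGEntityDecoder(hidden_size=256, num_queries=16, num_labels=5)
enc = torch.randn(2, 32, 256)  # stand-in encoder output: batch=2, seq_len=32
starts, ends, types = decoder(enc)
print(starts.shape, ends.shape, types.shape)

Because the slots form an unordered set, training a model like this typically needs an order-invariant loss (for example, bipartite matching between predicted and gold entities); that detail is omitted from the sketch.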

Citation (APA)

Zhang, X., Tan, M., Zhang, J., & Zhu, W. (2023). NAG-NER: a Unified Non-Autoregressive Generation Framework for Various NER Tasks. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 5, pp. 676–686). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-industry.65
