2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition


Abstract

Prompt-based learning has emerged as a powerful technique in natural language processing (NLP) due to its ability to leverage pre-training knowledge for downstream few-shot tasks. In this paper, we propose 2INER, a novel text-to-text framework for few-shot Named Entity Recognition (NER). Our approach employs instruction fine-tuning based on InstructionNER (Wang et al., 2022) to enable the model to effectively comprehend and process task-specific instructions, covering both main and auxiliary tasks. We also introduce a new auxiliary task, Type Extraction, to enhance the model's understanding of entity types in the overall semantic context of a sentence. To facilitate in-context learning, we concatenate examples to the input, enabling the model to learn from additional contextual information. Experimental results on four datasets demonstrate that our approach outperforms existing few-shot NER methods and remains competitive with state-of-the-art standard NER algorithms.
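To make the in-context setup concrete, the sketch below shows one plausible way to assemble a text-to-text input in the spirit described by the abstract: a task instruction listing candidate entity types, a few concatenated demonstration examples, and the query sentence. The function name, template wording, and answer format here are illustrative assumptions, not the paper's actual prompt.

```python
# Hypothetical sketch of a text-to-text NER prompt with in-context examples.
# The exact instruction template and answer format used by 2INER may differ.

def build_ner_prompt(sentence, entity_types, examples):
    """Assemble a single input string for a text-to-text NER model.

    sentence:     the query sentence to tag
    entity_types: entity type names, e.g. ["person", "location"]
    examples:     (sentence, answer) pairs concatenated for in-context learning
    """
    instruction = (
        "Please extract all named entities from the sentence and "
        "state each entity's type. Candidate types: "
        + ", ".join(entity_types) + "."
    )
    # Each demonstration is rendered in the same format as the final query.
    demos = "\n\n".join(
        f"Sentence: {ex_sentence}\nAnswer: {ex_answer}"
        for ex_sentence, ex_answer in examples
    )
    return f"{instruction}\n\n{demos}\n\nSentence: {sentence}\nAnswer:"


prompt = build_ner_prompt(
    "Barack Obama visited Paris.",
    ["person", "location"],
    [("Angela Merkel met Macron in Berlin.",
      "Angela Merkel is person. Macron is person. Berlin is location.")],
)
print(prompt)
```

The model would then be expected to generate the answer string for the final sentence, which is decoded back into entity spans and types.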

Citation (APA)

Zhang, J., Liu, X., Lai, X., Gao, Y., Wang, S., Hu, Y., & Lin, Y. (2023). 2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 3940–3951). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-emnlp.259
