Incremental Few-shot Text Classification with Multi-round New Classes: Formulation, Dataset and System

Citations: 30
Mendeley readers: 83

Abstract

Text classification is usually studied as labeling natural-language texts with relevant categories from a predefined set. In the real world, however, new classes may keep challenging an existing system, each arriving with only limited labeled data, and the system should be able to recognize these upcoming classes from a few examples. In this work, we define a new NLP task, incremental few-shot text classification, in which the system handles multiple rounds of new classes. In each round, a batch of new classes arrives with a few labeled examples per class. Two major challenges exist in this new task: (i) for the learning process, the system should learn new classes incrementally, round by round, without re-training on the examples of preceding classes; (ii) for the performance, the system should perform well on new classes without much loss on preceding classes. In addition to formulating the new task, we release two benchmark datasets in the incremental few-shot setting, for intent classification and relation classification. Moreover, we propose two entailment approaches, ENTAILMENT and HYBRID, which show promise for solving this novel problem.
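The abstract does not spell out the mechanics of the entailment approaches, but the general recipe of casting classification as textual entailment can be sketched with an off-the-shelf NLI model: each candidate class becomes a hypothesis, so absorbing a round of new classes means adding hypotheses rather than re-training on old examples. The snippet below is a minimal illustration of that idea using Hugging Face's zero-shot-classification pipeline (a generic entailment classifier, not the authors' ENTAILMENT or HYBRID systems); the intent labels, rounds, and test utterance are hypothetical.

# Minimal sketch: classification via textual entailment, with classes
# arriving in incremental rounds. Uses a generic pretrained NLI model,
# not the paper's system.
from transformers import pipeline

# The input text serves as the premise; each class is turned into a
# hypothesis such as "This example is about <class>."
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

# Hypothetical incremental rounds, each a batch of new intent classes.
rounds = [
    ["play_music", "get_weather"],   # base round
    ["book_flight", "set_alarm"],    # round 1 of new classes
    ["transfer_money"],              # round 2 of new classes
]

seen_classes = []
for r, new_classes in enumerate(rounds):
    # "Learning" a round amounts to adding hypotheses for the new
    # classes; no re-training on examples of preceding classes occurs.
    seen_classes.extend(new_classes)
    result = classifier(
        "Wake me up at 7 tomorrow morning",  # hypothetical test utterance
        candidate_labels=seen_classes,
        hypothesis_template="This example is about {}.",
    )
    print(f"round {r}: predicted {result['labels'][0]} "
          f"(score {result['scores'][0]:.2f})")

After round 1 adds set_alarm to the candidate set, the utterance should score highest against that class, while earlier classes remain in the candidate set, which mirrors the task's requirement of recognizing new classes without discarding preceding ones.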

Citation (APA)

Xia, C., Yin, W., Feng, Y., & Yu, P. (2021). Incremental Few-shot Text Classification with Multi-round New Classes: Formulation, Dataset and System. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021) (pp. 1351–1360). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.naacl-main.106
