Class-Incremental Learning based on Label Generation

Citations: 4 · Mendeley readers: 15

Abstract

Despite the great success of pre-trained language models, using them for continual learning remains a challenge, especially in the class-incremental learning (CIL) setting, because of catastrophic forgetting (CF). This paper reports our finding that if we formulate CIL as a continual label-generation problem, CF is drastically reduced and the generalizable representations of pre-trained models are better retained. We thus propose a new CIL method (VAG) that also leverages vocabulary sparsity to focus generation and creates pseudo-replay samples from label semantics. Experimental results show that VAG outperforms baselines by a large margin.
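The abstract's vocabulary-sparsity idea can be illustrated with a minimal, hypothetical sketch (the toy vocabulary, label tokens, and function names below are assumptions for illustration, not the paper's actual implementation): when classification is framed as generating the label's text, only tokens that occur in some class label matter, so the logits of all other vocabulary entries can be masked before the softmax.

```python
import math

# Toy vocabulary and label tokens (hypothetical, for illustration only).
VOCAB = ["the", "good", "bad", "positive", "negative", "film"]
LABEL_TOKENS = {"positive", "negative"}  # tokens appearing in class labels

def masked_softmax(logits, vocab=VOCAB, keep=LABEL_TOKENS):
    """Softmax restricted to label tokens; all other entries get probability 0."""
    masked = [x if tok in keep else float("-inf")
              for tok, x in zip(vocab, logits)]
    m = max(masked)  # subtract max for numerical stability
    exps = [math.exp(x - m) if x != float("-inf") else 0.0 for x in masked]
    z = sum(exps)
    return [e / z for e in exps]

# Raw logits over the full vocabulary; after masking, only the two label
# tokens receive non-zero probability.
probs = masked_softmax([0.1, 2.0, 1.5, 3.0, 1.0, 0.2])
```

Restricting generation this way keeps the output space small per task, which is one plausible reading of how "the sparsity of vocabulary" focuses the generation objective.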

Citation (APA)

Shao, Y., Guo, Y., Zhao, D., & Liu, B. (2023). Class-Incremental Learning based on Label Generation. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 1263–1276). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-short.109
