Contrastive Continual Learning with Importance Sampling and Prototype-Instance Relation Distillation

12 citations · 6 Mendeley readers

Abstract

Recently, owing to the high-quality representations produced by contrastive learning methods, rehearsal-based contrastive continual learning has been proposed to explore how to continually learn transferable representation embeddings and avoid the catastrophic forgetting issue of traditional continual settings. Building on this framework, we propose Contrastive Continual Learning via Importance Sampling (CCLIS), which preserves knowledge by recovering previous data distributions with a new strategy for Replay Buffer Selection (RBS) that minimizes the estimated variance so as to retain hard negative samples for high-quality representation learning. Furthermore, we present the Prototype-instance Relation Distillation (PRD) loss, a technique designed to maintain the relationship between prototypes and sample representations via a self-distillation process. Experiments on standard continual learning benchmarks show that our method notably outperforms existing baselines in knowledge preservation and thereby effectively counteracts catastrophic forgetting in online contexts. The code is available at https://github.com/lijy373/CCLIS.
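The RBS strategy described above can be read as an importance-sampling step: samples that matter most for recovering the previous data distribution (e.g., hard negatives with large importance weights) are preferentially kept in the replay buffer. The following is a minimal sketch of that idea only; the function name, the assumption of pre-computed per-sample weights, and the weighted sampling without replacement are illustrative choices, not the paper's exact variance-minimizing procedure.

```python
import numpy as np

def select_replay_buffer(importance_weights, buffer_size, rng=None):
    """Toy replay-buffer selection by importance sampling (illustrative only).

    importance_weights: per-sample scores indicating how strongly each sample
        contributes to recovering the previous data distribution.
    buffer_size: number of samples to retain in the replay buffer.
    Samples indices without replacement with probability proportional to the
    scores, so hard negatives with large weights are more likely to be kept.
    """
    w = np.asarray(importance_weights, dtype=np.float64)
    p = w / w.sum()                      # normalize scores into a distribution
    rng = np.random.default_rng() if rng is None else rng
    return rng.choice(len(w), size=buffer_size, replace=False, p=p)
```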
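Similarly, the PRD loss can be thought of as self-distillation over prototype-instance similarities: each instance's distribution over the prototypes under the current model is pushed toward the distribution obtained with the prototypes frozen from the previous step. The sketch below is a hedged illustration under that reading; the function name, tensor shapes, and temperature are assumptions rather than the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def prd_loss(features, prototypes_old, prototypes_new, temperature=0.1):
    """Illustrative prototype-instance relation distillation (assumed form).

    features:       (N, D) L2-normalized instance embeddings (current model)
    prototypes_old: (K, D) prototypes frozen from the previous step (teacher)
    prototypes_new: (K, D) current prototypes (student)
    Returns a distillation loss that keeps each instance's similarity
    distribution over prototypes close to the old one.
    """
    # Prototype-instance similarities turned into distributions per instance.
    p_old = F.softmax(features @ prototypes_old.t() / temperature, dim=1)
    p_new = F.log_softmax(features @ prototypes_new.t() / temperature, dim=1)
    # Cross-entropy between teacher and student relation distributions.
    return -(p_old * p_new).sum(dim=1).mean()
```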

Citation (APA)

Li, J., Azizov, D., Li, Y., & Liang, S. (2024). Contrastive Continual Learning with Importance Sampling and Prototype-Instance Relation Distillation. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 38, pp. 13554–13562). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v38i12.29259
