CL3: Generalization of Contrastive Loss for Lifelong Learning

Abstract

Lifelong learning describes gradual learning in nonstationary environments and emulates human learning, which is efficient, robust, and able to acquire new concepts incrementally from sequential experience. To equip neural networks with such a capability, one must overcome catastrophic forgetting, the phenomenon of losing past knowledge while learning new concepts. In this work, we propose a novel knowledge distillation algorithm that uses contrastive learning to help a neural network preserve its past knowledge while learning from a series of tasks. Our generalized contrastive distillation strategy tackles catastrophic forgetting of old knowledge, minimizes semantic drift by maintaining a similar embedding space, and ensures compactness in the feature distribution to accommodate novel tasks in the current model. Our comprehensive study shows that the method achieves improved performance in challenging class-incremental, task-incremental, and domain-incremental supervised learning scenarios.
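The abstract describes distilling knowledge from the previous model into the current one through a contrastive objective that keeps the embedding space stable across tasks. The sketch below is a minimal, generic contrastive distillation loss in that spirit, not the paper's exact CL3 formulation; it assumes a PyTorch setup with a frozen old model and a trainable new model, and names such as contrastive_distillation_loss, temperature, and lambda_distill are illustrative assumptions.

```python
# Illustrative sketch only: a generic contrastive (InfoNCE-style) distillation
# loss in the spirit described by the abstract, NOT the paper's CL3 loss.
import torch
import torch.nn.functional as F


def contrastive_distillation_loss(new_feats, old_feats, temperature=0.1):
    """Pull each new embedding toward the old (frozen) model's embedding of
    the same sample, and push it away from other samples' old embeddings.

    new_feats, old_feats: (batch, dim) feature tensors from the current and
    previous models for the same mini-batch of inputs.
    """
    new_feats = F.normalize(new_feats, dim=1)
    old_feats = F.normalize(old_feats, dim=1)

    # Cosine similarity between every new embedding and every old embedding.
    logits = new_feats @ old_feats.t() / temperature  # (batch, batch)

    # The positive for sample i is the old embedding of the same sample i,
    # so the target pairing is the identity (standard InfoNCE cross-entropy).
    targets = torch.arange(new_feats.size(0), device=new_feats.device)
    return F.cross_entropy(logits, targets)


# Usage sketch (hypothetical training loop): the old model is frozen and the
# distillation term is added to the current task's supervised loss.
# for x, y in current_task_loader:
#     with torch.no_grad():
#         old_z = old_model(x)
#     new_z = new_model(x)
#     loss = task_loss(classifier(new_z), y) \
#            + lambda_distill * contrastive_distillation_loss(new_z, old_z)
```

Anchoring the new embeddings to the old ones in this contrastive fashion limits semantic drift of previously learned classes while still allowing the feature space to reorganize enough to fit new tasks.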

Cite

Roy, K., Simon, C., Moghadam, P., & Harandi, M. (2023). CL3: Generalization of Contrastive Loss for Lifelong Learning. Journal of Imaging, 9(12). https://doi.org/10.3390/jimaging9120259
