Keep and learn: Continual learning by constraining the latent space for knowledge preservation in neural networks

Citations: 13
Readers (Mendeley): 60


Abstract

Data is one of the most important factors in machine learning. However, even with high-quality data, there are situations in which access to the data is restricted. For example, access to medical data from outside an institution is strictly limited due to privacy concerns. In such cases, a model must be learned sequentially, using only the data accessible at each stage. In this work, we propose a new method for preserving learned knowledge by modeling the high-level feature space and the output space to be mutually informative, and by constraining feature vectors to lie in the modeled space during training. The proposed method is easy to implement, as it can be applied by simply adding a reconstruction loss to the objective function. We evaluate the proposed method on CIFAR-10/100 and a chest X-ray dataset, and show benefits in terms of knowledge preservation compared to previous approaches.
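To make the "simply adding a reconstruction loss" idea concrete, below is a minimal PyTorch sketch. It assumes a hypothetical encoder/head/decoder architecture and a plain MSE reconstruction term between the penultimate features and the features recovered from the output logits; it illustrates the general idea of constraining features via a reconstruction loss, not the authors' exact architecture or training procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    """Classifier with an auxiliary decoder tying the feature and output spaces."""

    def __init__(self, num_classes=10, feat_dim=128):
        super().__init__()
        # Encoder producing high-level feature vectors (hypothetical layer sizes).
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )
        # Classification head mapping features to the output space.
        self.head = nn.Linear(feat_dim, num_classes)
        # Decoder mapping outputs back to the feature space; its reconstruction
        # error is the extra term added to the objective.
        self.decoder = nn.Linear(num_classes, feat_dim)

    def forward(self, x):
        h = self.encoder(x)            # high-level feature vector
        logits = self.head(h)          # output-space representation
        h_rec = self.decoder(logits)   # feature vector reconstructed from the output
        return logits, h, h_rec


def objective(logits, h, h_rec, targets, lam=1.0):
    # Standard task loss plus a reconstruction loss that encourages features
    # to lie in the space recoverable from the outputs (lam is a weighting
    # hyperparameter assumed here for illustration).
    ce = F.cross_entropy(logits, targets)
    rec = F.mse_loss(h_rec, h)
    return ce + lam * rec


# Usage sketch on a single batch of CIFAR-sized inputs.
model = Net()
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
logits, h, h_rec = model(x)
loss = objective(logits, h, h_rec, y)
loss.backward()
```

When training on a new stage of data, keeping this reconstruction term in the objective penalizes feature drift that the output space can no longer explain, which is the knowledge-preservation effect the abstract describes.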

Citation (APA)

Kim, H. E., Kim, S., & Lee, J. (2018). Keep and learn: Continual learning by constraining the latent space for knowledge preservation in neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11070 LNCS, pp. 520–528). Springer Verlag. https://doi.org/10.1007/978-3-030-00928-1_59
