A broad neural network structure for class incremental learning


Abstract

Class incremental learning, the task of learning new concepts over time, is a promising research topic. Because the number of output classes is not known in advance, methods must be able to model new classes while preserving the performance of the pre-trained model. Doing so runs into the catastrophic forgetting problem: performance deteriorates when the pre-trained model is updated with new-class data but without the old data. In this paper, we propose a novel learning framework, the Broad Class Incremental Learning System (BCILS), to tackle this issue. When training data from unknown classes arrive, BCILS updates the model with a deduced iterative formula; this differs from most existing fine-tuning-based class incremental learning algorithms. The advantages of the proposed approach are that it is (1) easy to model, (2) flexible in structure, and (3) able to preserve pre-trained performance well. Finally, we conduct extensive experiments to demonstrate the superiority of the proposed BCILS.
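To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of a broad-learning-style class-incremental update of a linear output layer. It is not the paper's exact BCILS formula: it assumes the output weights are solved by ridge regression and that only sufficient statistics (Gram matrix and cross-correlation) are cached, so the model can be widened for a new class without revisiting old raw data. All variable names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, d = 1e-3, 20  # ridge penalty, feature dimension (illustrative values)

# Pre-training on 2 old classes: keep only sufficient statistics,
# G = A^T A + lam*I and C = A^T Y, so old raw data need not be stored.
A_old = rng.standard_normal((100, d))           # broad feature matrix
Y_old = np.eye(2)[rng.integers(0, 2, 100)]      # one-hot labels for 2 classes
G = A_old.T @ A_old + lam * np.eye(d)
C = A_old.T @ Y_old
W = np.linalg.solve(G, C)                       # output weights, shape (20, 2)

# A third class arrives: widen C with a zero column for the new output
# node, fold in the new-class statistics, and re-solve for W -- no old
# samples are touched, only the cached G and C.
A_new = rng.standard_normal((50, d))
Y_new = np.zeros((50, 3))
Y_new[:, -1] = 1.0                              # all new samples are class 2
G += A_new.T @ A_new
C = np.hstack([C, np.zeros((d, 1))]) + A_new.T @ Y_new
W = np.linalg.solve(G, C)                       # updated weights, shape (20, 3)
print(W.shape)
```

Because the old classes' contribution stays fixed inside `G` and `C`, their decision behavior is largely preserved while the new output column is fitted, which is the flavor of update the abstract describes.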

Citation (APA)

Liu, W., Yang, H., Sun, Y., & Sun, C. (2018). A broad neural network structure for class incremental learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10878 LNCS, pp. 229–238). Springer Verlag. https://doi.org/10.1007/978-3-319-92537-0_27
