Incremental learning by heterogeneous Bagging ensemble


Abstract

Classifier ensembles are a major direction of incremental learning research, and many ensemble-based incremental learning methods have been proposed. Among them, Learn++, which is derived from the well-known ensemble algorithm AdaBoost, is special: it can work with any type of base classifier, whether or not the classifier is specifically designed for incremental learning, which means Learn++ potentially supports heterogeneous base classifiers. Based on extensive experiments, we analyze the advantages and disadvantages of Learn++. We then present a new ensemble incremental learning method, Bagging++, which is based on another well-known ensemble method, Bagging. The experimental results show that Bagging ensembles are a promising approach to incremental learning, and that heterogeneous Bagging++ achieves better generalization and faster learning than the compared methods, including Learn++ and NCL. © 2010 Springer-Verlag.
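
The abstract only names the approach; the sketch below illustrates the general idea of a heterogeneous Bagging-based incremental ensemble, not the authors' exact Bagging++ algorithm. It assumes scikit-learn base learners, bootstrap resampling of each incoming data batch, and simple majority voting; the class name HeterogeneousBaggingIncremental and all parameter choices are hypothetical.

```python
# A minimal sketch of a heterogeneous Bagging-style incremental ensemble.
# Names and parameters are illustrative assumptions, not the paper's code.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier


class HeterogeneousBaggingIncremental:
    """Grow a pool of bagged, heterogeneous classifiers batch by batch."""

    def __init__(self, base_learners=None, n_per_batch=5, random_state=0):
        # Heterogeneous pool of base-learner prototypes (an assumption;
        # the paper may use a different mix of base classifiers).
        self.base_learners = base_learners or [
            DecisionTreeClassifier(),
            GaussianNB(),
            KNeighborsClassifier(),
        ]
        self.n_per_batch = n_per_batch
        self.rng = np.random.RandomState(random_state)
        self.ensemble_ = []  # classifiers accumulated over all batches

    def partial_fit(self, X, y):
        """Train new bagged members on the current data batch only."""
        n = len(X)
        for i in range(self.n_per_batch):
            # Standard Bagging step: bootstrap sample of the incoming batch.
            idx = self.rng.randint(0, n, size=n)
            # Cycle through the heterogeneous prototypes.
            proto = self.base_learners[i % len(self.base_learners)]
            self.ensemble_.append(clone(proto).fit(X[idx], y[idx]))
        return self

    def predict(self, X):
        """Combine all members trained so far by majority vote (integer labels assumed)."""
        votes = np.stack([clf.predict(X) for clf in self.ensemble_])
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes
        )
```

Each call to partial_fit touches only the new batch, so earlier ensemble members are never retrained; this is what makes a Bagging-style ensemble naturally incremental in the sense discussed in the abstract.
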

Citation (APA)

Zhao, Q. L., Jiang, Y. H., & Xu, M. (2010). Incremental learning by heterogeneous Bagging ensemble. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6441 LNAI, pp. 1–12). https://doi.org/10.1007/978-3-642-17313-4_1
