Local negative correlation with resampling

Abstract

This paper deals with a learning algorithm that combines two well-known methods for generating ensemble diversity: negative correlation of errors and resampling. In this algorithm, a set of learners iteratively and synchronously improve their state using information about the performance of a fixed number of other learners in the ensemble, generating a sort of local negative correlation. Resampling allows the base algorithm to control the impact of highly influential data points, which in turn can improve its generalization error. The resulting algorithm can be viewed as a generalization of bagging, where each learner is no longer independent but can be locally coupled with other learners. We demonstrate our technique on two real data sets using neural network ensembles. © Springer-Verlag Berlin Heidelberg 2006.
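The abstract only outlines the training scheme, so the sketch below illustrates the idea in Python/NumPy under stated assumptions: a ring neighbourhood of fixed size, a tiny single-hidden-layer learner, a fixed penalty strength `lam`, and a synthetic noisy-sine data set. The names `TinyNet`, `make_data`, and `train_local_nc`, and all hyperparameter values, are illustrative assumptions for this sketch, not the authors' exact algorithm. Each learner trains on its own bootstrap replicate while a negative-correlation-style penalty term pushes its output away from the mean of its local neighbourhood.

```python
# Hypothetical sketch of local negative correlation with resampling on a
# 1-D regression task. Names, the ring neighbourhood, and the penalty
# strength `lam` are illustrative assumptions, not the paper's recipe.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=200):
    # Noisy sine curve as a stand-in for a real data set.
    x = rng.uniform(-3, 3, size=(n, 1))
    y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(n)
    return x, y

class TinyNet:
    """Single-hidden-layer network trained by plain gradient descent."""
    def __init__(self, n_in, n_hidden=10):
        self.W1 = rng.standard_normal((n_in, n_hidden)) * 0.5
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.standard_normal(n_hidden) * 0.5
        self.b2 = 0.0

    def forward(self, x):
        self.h = np.tanh(x @ self.W1 + self.b1)   # cache activations for step()
        return self.h @ self.w2 + self.b2

    def step(self, x, grad_out, lr=0.05):
        # grad_out: d(loss)/d(output) for each example in the batch.
        n = len(grad_out)
        gh = np.outer(grad_out, self.w2) * (1 - self.h ** 2)
        self.W1 -= lr * x.T @ gh / n
        self.b1 -= lr * gh.mean(axis=0)
        self.w2 -= lr * self.h.T @ grad_out / n
        self.b2 -= lr * grad_out.mean()

def train_local_nc(x, y, m=8, k=2, lam=0.5, epochs=300):
    """m learners on bootstrap samples, each coupled to 2k ring neighbours."""
    nets = [TinyNet(x.shape[1]) for _ in range(m)]
    # Resampling: every learner gets its own bootstrap replicate of the data.
    boots = [rng.integers(0, len(x), size=len(x)) for _ in range(m)]
    for _ in range(epochs):
        # Synchronous pass: outputs of all learners on the full data set.
        outs = np.stack([net.forward(x) for net in nets])
        for i, net in enumerate(nets):
            idx = boots[i]
            nb = [(i + d) % m for d in range(-k, k + 1)]   # local neighbourhood
            f_loc = outs[nb].mean(axis=0)[idx]             # local ensemble output
            f_i = net.forward(x[idx])                      # refresh cached activations
            # Negative-correlation-style gradient: fit the target while being
            # pushed away from the local ensemble mean (decorrelation term).
            grad = (f_i - y[idx]) - lam * (f_i - f_loc)
            net.step(x[idx], grad)
    return nets

x, y = make_data()
ensemble = train_local_nc(x, y)
pred = np.mean([net.forward(x) for net in ensemble], axis=0)
print("ensemble MSE:", np.mean((pred - y) ** 2))
```

In this sketch, averaging the coupled learners at prediction time plays the role of the bagged ensemble; with `lam = 0` the loop degenerates to training independent bootstrap learners, i.e. ordinary bagging, which matches the abstract's view of the method as a generalization of bagging with locally coupled learners.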

Citation (APA)

Ñanculef, R., Valle, C., Allende, H., & Moraga, C. (2006). Local negative correlation with resampling. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4224 LNCS, pp. 570–577). Springer Verlag. https://doi.org/10.1007/11875581_69
