The diversity of regression ensembles combining bagging and random subspace method

Abstract

The concept of Ensemble Learning has been shown to increase predictive power over single base learners. In light of the bias-variance-covariance decomposition, diversity is a characteristic factor, since the ensemble error decreases as diversity increases. In this study, we apply Bagging and the Random Subspace Method (RSM) to ensembles of Local Linear Map (LLM) type, which achieve non-linearity through local linear approximation and are supplied with different vector quantization algorithms. The results on several benchmark data sets are compared to those of Random Forest and neural networks. We show which parameters have a major influence on ensemble diversity, and that our proposed combination of LLM ensembles with RSM achieves results comparable to those of the reference ensemble architectures. © 2009 Springer Berlin Heidelberg.
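The bias-variance-covariance decomposition referenced above is commonly stated as follows for an ensemble of M regressors f_1, ..., f_M combined by simple averaging, \bar{f} = \frac{1}{M}\sum_{i=1}^{M} f_i, with target t:

E\big[(\bar{f} - t)^2\big] = \overline{\mathrm{bias}}^2 + \frac{1}{M}\,\overline{\mathrm{var}} + \Big(1 - \frac{1}{M}\Big)\,\overline{\mathrm{covar}},

where the barred terms denote the averaged bias, the averaged variance, and the averaged pairwise covariance of the ensemble members. The covariance term is where diversity enters: the less correlated the members' errors are, the smaller the ensemble error becomes.

The LLM base learners used in the paper are not available in standard libraries, so the following is a hedged sketch only: it combines bagging (bootstrap resampling of training examples) with RSM (a random feature subset per learner) using scikit-learn's BaggingRegressor. The default decision-tree base learner stands in for the LLM, and the data set and parameter values are illustrative assumptions rather than the settings of the paper.

# Hedged sketch (not the authors' implementation): bagging + random subspace
# method (RSM) for regression via scikit-learn's BaggingRegressor. The default
# decision-tree base learner stands in for the paper's LLM; data set and
# parameter values are illustrative assumptions.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = BaggingRegressor(
    n_estimators=50,
    bootstrap=True,            # bagging: bootstrap-resample the training examples
    max_samples=0.8,           # fraction of examples drawn for each base learner
    bootstrap_features=False,
    max_features=0.5,          # RSM: each base learner sees a random subset of input dimensions
    random_state=0,
)
ensemble.fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, ensemble.predict(X_test)))

Varying max_features (the RSM subspace size) and max_samples (the bagging fraction) is one way to probe how such parameters influence diversity, for example by inspecting the spread of the individual members' predictions via ensemble.estimators_.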

CITATION STYLE

APA

Scherbart, A., & Nattkemper, T. W. (2009). The diversity of regression ensembles combining bagging and random subspace method. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5507 LNCS, pp. 911–918). https://doi.org/10.1007/978-3-642-03040-6_111
