Trimmed bagging

Abstract

Bagging has been found to be successful in increasing the predictive performance of unstable classifiers. Bagging draws bootstrap samples from the training sample, applies the classifier to each bootstrap sample, and then averages over all obtained classification rules. The idea of trimmed bagging is to exclude the bootstrapped classification rules that yield the highest error rates, as estimated by the out-of-bag error rate, and to aggregate over the remaining ones. In this note we explore the potential benefits of trimmed bagging. On the basis of numerical experiments, we conclude that trimmed bagging performs comparably to standard bagging when applied to unstable classifiers such as decision trees, but yields better results when applied to more stable base classifiers, such as support vector machines.
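
To make the procedure concrete, here is a minimal sketch of trimmed bagging as described in the abstract. It is an illustrative reading, not the authors' implementation: the function names, the `trim` fraction parameter, and the scikit-learn decision tree used as the default base classifier are all assumptions; any estimator with `fit`/`predict` methods (e.g. a support vector machine) could be substituted.

```python
# Illustrative sketch of trimmed bagging, assuming numpy arrays X, y
# with nonnegative integer class labels. Not the authors' code; the
# names trimmed_bagging_fit/predict and the `trim` parameter are
# hypothetical.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

def trimmed_bagging_fit(X, y, base=None, n_estimators=50, trim=0.25, seed=0):
    """Fit bootstrap replicates, then keep the (1 - trim) fraction of
    rules with the lowest out-of-bag error rates."""
    rng = np.random.default_rng(seed)
    n = len(X)
    fitted = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)       # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)  # out-of-bag observations
        clf = clone(base) if base is not None else DecisionTreeClassifier()
        clf.fit(X[idx], y[idx])
        # The out-of-bag error rate estimates this rule's true error.
        err = np.mean(clf.predict(X[oob]) != y[oob]) if len(oob) else 1.0
        fitted.append((err, clf))
    fitted.sort(key=lambda pair: pair[0])      # best (lowest error) first
    keep = max(1, int(round((1 - trim) * n_estimators)))
    return [clf for _, clf in fitted[:keep]]   # trimmed ensemble

def trimmed_bagging_predict(ensemble, X):
    """Aggregate the retained rules by majority vote."""
    votes = np.stack([clf.predict(X) for clf in ensemble])
    # Majority vote across the kept classifiers for each test point.
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)

# Example usage (X_train, y_train, X_test are hypothetical arrays):
# ensemble = trimmed_bagging_fit(X_train, y_train, trim=0.25)
# y_hat = trimmed_bagging_predict(ensemble, X_test)
```

In this sketch, `trim=0.0` keeps every bootstrapped rule and so reduces to standard bagging; the trimming fraction is the only new tuning choice.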

Citation (APA)

Croux, C., Joossens, K., & Lemmens, A. (2007). Trimmed bagging. Computational Statistics and Data Analysis, 52(1), 362–368. https://doi.org/10.1016/j.csda.2007.06.012
