When efficient model averaging out-performs boosting and bagging

Abstract

The Bayes optimal classifier (BOC) is an ensemble technique used extensively in the statistics literature. However, compared to other ensemble techniques such as bagging and boosting, BOC is less well known and rarely used in data mining. This is partly because BOC is perceived as inefficient and because bagging and boosting consistently outperform a single model, which raises the question: "Do we even need BOC in data mining?". We show that the answer to this question is "yes" by illustrating that several recent efficient model averaging approximations to BOC can significantly outperform bagging and boosting in realistic situations such as extensive class label noise, sample selection bias, and many-class problems. To our knowledge, the finding that model averaging techniques outperform bagging and boosting in these situations has not previously been published in the machine learning, data mining, or statistics communities. © Springer-Verlag Berlin Heidelberg 2006.
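The abstract does not detail the paper's specific efficient approximations to BOC, but the core contrast it draws can be sketched generically: where bagging gives every ensemble member an equal vote, model averaging weights each candidate model's prediction by an approximation of its posterior probability given the data. The sketch below is a minimal illustration of that contrast, assuming scikit-learn, a synthetic noisy-label task, and a held-out-likelihood weighting heuristic; it is not the authors' method.

```python
# Minimal sketch: posterior-weighted model averaging vs. a uniform vote.
# All names here (data, model space, weighting heuristic) are illustrative
# assumptions, not the approximations studied in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic task with substantial label noise (flip_y flips 20% of labels).
X, y = make_classification(n_samples=1000, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
X_fit, X_val, y_fit, y_val = train_test_split(X_tr, y_tr, test_size=0.3,
                                              random_state=0)

# A small model space: decision trees of varying depth.
models = [DecisionTreeClassifier(max_depth=d, random_state=0).fit(X_fit, y_fit)
          for d in (1, 2, 3, 5, 8)]

def held_out_log_likelihood(m):
    """Log-likelihood of the held-out validation labels under model m."""
    p = m.predict_proba(X_val)[np.arange(len(y_val)), y_val]
    return np.log(np.clip(p, 1e-12, 1.0)).sum()

# Posterior-style weights: softmax of held-out log-likelihoods.
ll = np.array([held_out_log_likelihood(m) for m in models])
w = np.exp(ll - ll.max())
w /= w.sum()

# Model averaging: posterior-weighted average of class probabilities.
avg = sum(wi * m.predict_proba(X_te) for wi, m in zip(w, models))
# Bagging-style baseline: uniform average over the same models.
uni = sum(m.predict_proba(X_te) for m in models) / len(models)

print("weighted averaging accuracy:", (avg.argmax(axis=1) == y_te).mean())
print("uniform vote accuracy:     ", (uni.argmax(axis=1) == y_te).mean())
```

Weighting by held-out likelihood down-weights models that overfit the flipped labels, which loosely mirrors the intuition behind the abstract's claim that model averaging helps under extensive class label noise.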

Citation (APA):

Davidson, I., & Fan, W. (2006). When efficient model averaging out-performs boosting and bagging. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4213 LNAI, pp. 478–486). Springer Verlag. https://doi.org/10.1007/11871637_46
