Ensemble Classifiers: AdaBoost and Orthogonal Evolution of Teams

  • Soule T
  • Heckendorn R
  • Dyre B
  • Lew R
Citations: N/A
Readers: 15 (Mendeley)

Abstract

AdaBoost is one of the most commonly used and most successful approaches for generating ensemble classifiers. However, AdaBoost is limited in that it requires independent training cases and can only use voting as a cooperation mechanism. This paper compares AdaBoost to Orthogonal Evolution of Teams (OET), an approach for generating ensembles that allows for a much wider range of problems and cooperation mechanisms. The set of test problems includes problems with significant amounts of noise in the form of erroneous training cases and problems with adjustable levels of epistasis. The results demonstrate that OET is a suitable alternative to AdaBoost for generating ensembles. Over the set of all tested problems, OET with a hierarchical cooperation mechanism, rather than voting, is slightly more likely to produce better results. This is most apparent on the problems with very high levels of noise, suggesting that the hierarchical approach is less subject to over-fitting than voting techniques. The results also suggest that there are specific problems and features of problems that make them better suited for different training algorithms and different cooperation mechanisms.
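OET and the paper's test problems are beyond the scope of a snippet, but the AdaBoost side of the comparison is standard: iteratively re-weight training cases toward the ones the current ensemble gets wrong, then combine the weak learners by weighted vote. A minimal sketch with decision-stump weak learners on an illustrative 1-D dataset (the data, thresholds, and round count here are hypothetical, not from the paper):

```python
import math

# Toy 1-D training set: (feature, label) pairs with labels in {-1, +1}.
# No single threshold separates it, so a lone stump misclassifies some
# cases; boosting combines several stumps to do better.
data = [(0.0, 1), (1.0, 1), (2.0, -1), (3.0, -1), (4.0, 1), (5.0, 1)]

def stump(threshold, polarity):
    """A weak learner: predict +polarity if x >= threshold, else -polarity."""
    return lambda x: polarity if x >= threshold else -polarity

# Candidate weak learners: thresholds between the points, both polarities.
candidates = [stump(t, p) for t in (0.5, 1.5, 2.5, 3.5, 4.5) for p in (1, -1)]

def adaboost(data, candidates, rounds):
    n = len(data)
    w = [1.0 / n] * n   # AdaBoost's per-case weights, initially uniform
    ensemble = []       # list of (alpha, classifier) pairs
    for _ in range(rounds):
        # Choose the weak learner with the lowest weighted training error.
        def weighted_error(c):
            return sum(wi for wi, (x, y) in zip(w, data) if c(x) != y)
        h = min(candidates, key=weighted_error)
        err = max(weighted_error(h), 1e-12)  # clamp to keep log finite
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((alpha, h))
        # Re-weight: misclassified cases gain weight, so the next round
        # concentrates on them.
        w = [wi * math.exp(-alpha * y * h(x)) for wi, (x, y) in zip(w, data)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def vote(ensemble, x):
    """Weighted majority vote -- the cooperation mechanism AdaBoost is tied to."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

ensemble = adaboost(data, candidates, rounds=5)
correct = sum(vote(ensemble, x) == y for x, y in data)
print(f"training accuracy: {correct}/{len(data)}")
```

By contrast, a hierarchical cooperation mechanism of the kind OET can evolve replaces the weighted vote with, for example, a team member that arbitrates among the others' outputs, which is one reason it can apply to a wider range of problems than voting alone.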

Citation (APA)

Soule, T., Heckendorn, R. B., Dyre, B., & Lew, R. (2011). Ensemble Classifiers: AdaBoost and Orthogonal Evolution of Teams (pp. 55–69). https://doi.org/10.1007/978-1-4419-7747-2_4
