How to make AdaBoost.M1 work for weak base classifiers by changing only one line of the code

Abstract

If one has a multiclass classification problem and wants to boost a multiclass base classifier, AdaBoost.M1 is a well-known and widely applied boosting algorithm. However, AdaBoost.M1 does not work if the base classifier is too weak. We show that by modifying only one line of AdaBoost.M1 one can make it usable for weak base classifiers as well. The resulting classifier, AdaBoost.M1W, is guaranteed to minimize an upper bound on a performance measure, called the guessing error, as long as the base classifier is better than random guessing. The usability of AdaBoost.M1W could be clearly demonstrated experimentally. © Springer-Verlag Berlin Heidelberg 2002.
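
The abstract does not spell out the modified line itself, but the change usually cited for AdaBoost.M1W concerns the classifier weight: where AdaBoost.M1 sets alpha_t = ln((1 - err_t) / err_t) and must stop once err_t >= 1/2, AdaBoost.M1W is assumed here to use alpha_t = ln((K - 1)(1 - err_t) / err_t), which stays positive whenever the base classifier on K classes is better than random guessing. The Python sketch below is a minimal, hypothetical illustration of that single-line difference; the helper names (fit_adaboost_m1w, predict_adaboost_m1w) and the choice of decision stumps as base classifiers are assumptions made for the example, not taken from the paper.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_adaboost_m1w(X, y, n_rounds=50):
    """Boost decision stumps on a K-class problem (illustrative sketch)."""
    classes = np.unique(y)
    K = len(classes)
    n = len(y)
    w = np.full(n, 1.0 / n)                        # example weights, initially uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        miss = (pred != y)
        err = max(np.sum(w * miss) / np.sum(w), 1e-12)   # weighted training error
        if err >= (K - 1) / K:                     # no longer better than random guessing
            break
        # ----- the single line that differs from AdaBoost.M1 -------------------
        # AdaBoost.M1 : alpha = np.log((1 - err) / err)        (requires err < 1/2)
        alpha = np.log((K - 1) * (1 - err) / err)  # AdaBoost.M1W (assumed form)
        # ------------------------------------------------------------------------
        w = w * np.exp(alpha * miss)               # up-weight misclassified examples
        w = w / np.sum(w)
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas, classes

def predict_adaboost_m1w(learners, alphas, classes, X):
    """Combine the boosted stumps by a weighted vote over the K classes."""
    votes = np.zeros((len(X), len(classes)))
    for stump, alpha in zip(learners, alphas):
        pred = stump.predict(X)
        for k, c in enumerate(classes):
            votes[:, k] += alpha * (pred == c)
    return classes[np.argmax(votes, axis=1)]

With the original AdaBoost.M1 weight, the loop above would have to terminate as soon as a stump's weighted error reaches 0.5, which for problems with many classes can happen in the very first round; the modified weight only requires the stump to beat the 1/K accuracy of random guessing.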

Cite

APA

Eibl, G., & Pfeiffer, K. P. (2002). How to make AdaBoost.M1 work for weak base classifiers by changing only one line of the code. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2430, 72–83. https://doi.org/10.1007/3-540-36755-1_7
