Totally-corrective multi-class boosting

Abstract

We propose totally-corrective multi-class boosting algorithms in this work. First, we discuss methods that extend two-class boosting to the multi-class case by studying two existing boosting algorithms, AdaBoost.MO and SAMME, and formulate convex optimization problems that minimize their regularized cost functions. We then derive a column-generation based totally-corrective framework for multi-class boosting by examining the Lagrange dual problems. Experimental results on UCI datasets show that the new algorithms have comparable generalization capability but converge much faster than their counterparts. Experiments on MNIST handwritten digit classification also demonstrate the effectiveness of the proposed algorithms. © 2011 Springer-Verlag Berlin Heidelberg.
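
The abstract only sketches the approach, so the following is a minimal, illustrative Python sketch of a column-generation, totally-corrective boosting loop with a SAMME-style class coding. It is not the authors' exact algorithm: the stump-based weak-learner trainer, the exponential multi-class loss, and the L1 regularization weight `nu` are assumptions made for illustration only.

```python
# Hypothetical sketch of column-generation, totally-corrective multi-class
# boosting (SAMME-style coding). Loss, regularizer and weak learner are
# illustrative assumptions, not the paper's exact formulation.
import numpy as np
from scipy.optimize import minimize
from sklearn.tree import DecisionTreeClassifier


def fit_weak_learner(X, y, sample_weight):
    """Train a depth-1 tree on the weighted sample (the 'column generator')."""
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=sample_weight)
    return stump


def totally_corrective_boost(X, y, n_classes, max_iters=20, nu=1e-2):
    n = len(y)
    # SAMME-style coding: +1 for the true class, -1/(K-1) otherwise.
    Y = np.full((n, n_classes), -1.0 / (n_classes - 1))
    Y[np.arange(n), y] = 1.0

    learners, cols = [], []        # active weak learners and their columns
    u = np.full(n, 1.0 / n)        # sample weights (dual variables)
    w = np.array([])               # coefficients of the active learners

    for _ in range(max_iters):
        # 1) Column generation: add the weak learner with the largest edge
        #    under the current sample weights.
        h = fit_weak_learner(X, y, sample_weight=u)
        pred = h.predict(X)
        Hj = np.full((n, n_classes), -1.0 / (n_classes - 1))
        Hj[np.arange(n), pred] = 1.0
        learners.append(h)
        cols.append((Hj * Y).sum(axis=1))   # per-sample margin contribution

        M = np.column_stack(cols)           # n x (number of active columns)

        # 2) Totally-corrective step: re-optimize ALL coefficients w >= 0
        #    against an L1-regularized exponential loss.
        def primal(w):
            return np.mean(np.exp(-(M @ w))) + nu * np.sum(w)

        w0 = np.full(M.shape[1], 1.0 / M.shape[1])
        w = minimize(primal, w0, bounds=[(0, None)] * M.shape[1]).x

        # 3) Refresh sample weights from the new margins.
        u = np.exp(-(M @ w))
        u /= u.sum()

    return learners, w
```

The key difference from stage-wise AdaBoost.MO or SAMME is step 2: every coefficient in the active set is re-optimized at each round, which is what makes the method "totally corrective" and is the reason the column-generation variants typically converge in fewer boosting rounds.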

Citation (APA)

Hao, Z., Shen, C., Barnes, N., & Wang, B. (2011). Totally-corrective multi-class boosting. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6495 LNCS, pp. 269–280). https://doi.org/10.1007/978-3-642-19282-1_22
