Model selection method for AdaBoost using formal information criteria


Abstract

AdaBoost is widely used in information systems, artificial intelligence, and bioinformatics because it provides an efficient function approximation method. However, AdaBoost employs neither the maximum likelihood method nor Bayes estimation, and hence its generalization performance is not yet theoretically known; consequently, an optimization method that minimizes the generalization error has not been established. In this paper, we propose a new method for selecting an optimal model using the formal information criteria AIC and BIC. Although neither AIC nor BIC theoretically corresponds to the generalization error in AdaBoost, we show experimentally that an optimal model can be chosen by formal AIC and BIC. © 2009 Springer Berlin Heidelberg.
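The general idea of formal information criteria is to score each candidate model by its training log-likelihood penalized by its number of parameters, then pick the candidate with the lowest score. As a minimal illustration (not the authors' procedure), the sketch below compares hypothetical AdaBoost ensembles of different sizes T, assuming each weak learner contributes a fixed number of parameters; all numbers are made up for demonstration:

```python
import math

def aic(log_lik, k):
    # Formal AIC: -2 * log-likelihood + 2 * (number of parameters)
    return -2.0 * log_lik + 2.0 * k

def bic(log_lik, k, n):
    # Formal BIC: -2 * log-likelihood + (number of parameters) * log(sample size)
    return -2.0 * log_lik + k * math.log(n)

# Hypothetical candidates: ensembles with T weak learners and the training
# log-likelihood each attained. These values are illustrative, not from the paper.
n = 200  # assumed training sample size
candidates = [
    (5,  -120.0),
    (10, -100.0),
    (20,  -95.0),
    (40,  -93.0),
]

params_per_learner = 2  # assumption: e.g. a stump threshold plus a combination weight

best_aic = min(candidates, key=lambda c: aic(c[1], params_per_learner * c[0]))
best_bic = min(candidates, key=lambda c: bic(c[1], params_per_learner * c[0], n))
print(best_aic[0], best_bic[0])  # prints: 10 5
```

Note how BIC's log(n) penalty selects the smaller ensemble here, while AIC tolerates a larger one; the paper's experimental point is that such formal scores can still identify a good model size even though neither criterion is theoretically justified for AdaBoost.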

CITATION STYLE (APA)

Kaji, D., & Watanabe, S. (2009). Model selection method for AdaBoost using formal information criteria. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5507 LNCS, pp. 903–910). https://doi.org/10.1007/978-3-642-03040-6_110
