Information theoretic combination of classifiers with application to AdaBoost

7 citations · 15 readers (Mendeley)

Abstract

Combining several classifiers has proved to be an effective machine learning technique. We propose a new information-theoretic measure of the goodness of an ensemble of classifiers, which captures a trade-off between diversity and individual classifier accuracy. This measure can be used directly to select an ensemble from a pool of classifiers. We also propose a variant of AdaBoost that trains the classifiers directly to take this new information-theoretic measure into account. © Springer-Verlag Berlin Heidelberg 2007.
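The abstract describes selecting an ensemble from a pool by scoring the trade-off between diversity and individual accuracy. The paper's exact information-theoretic measure is not reproduced here; the sketch below illustrates the selection scheme with a simple stand-in score (mean individual accuracy blended with pairwise disagreement, a common diversity proxy), all names and the `tradeoff` weight being illustrative assumptions.

```python
from itertools import combinations

def accuracy(preds, labels):
    """Fraction of samples a classifier gets right."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def disagreement(preds_a, preds_b):
    """Fraction of samples on which two classifiers differ (diversity proxy)."""
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)

def ensemble_score(pool, members, labels, tradeoff=0.5):
    """Stand-in goodness score: blend of mean accuracy and mean pairwise
    diversity (NOT the paper's information-theoretic measure)."""
    accs = [accuracy(pool[m], labels) for m in members]
    pairs = list(combinations(members, 2))
    div = (sum(disagreement(pool[a], pool[b]) for a, b in pairs) / len(pairs)
           if pairs else 0.0)
    mean_acc = sum(accs) / len(accs)
    return (1 - tradeoff) * mean_acc + tradeoff * div

def select_ensemble(pool, labels, size, tradeoff=0.5):
    """Exhaustively score all subsets of the pool and keep the best
    (fine for small pools, as in this toy example)."""
    best = max(combinations(range(len(pool)), size),
               key=lambda m: ensemble_score(pool, m, labels, tradeoff))
    return list(best)

# Toy pool: each row is one classifier's predictions on five samples.
labels = [1, 0, 1, 1, 0]
pool = [
    [1, 0, 1, 1, 0],   # perfect
    [1, 0, 1, 0, 0],   # one error
    [1, 0, 1, 0, 0],   # duplicate of the previous (no diversity)
    [0, 0, 1, 1, 1],   # different errors (more diverse)
]
print(select_ensemble(pool, labels, size=2))
```

Note that with an equal trade-off weight the selection can prefer two diverse, moderately accurate classifiers over a pair containing the single best one; the weight controls how strongly diversity is rewarded relative to individual accuracy.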

Citation (APA)

Meynet, J., & Thiran, J. P. (2007). Information theoretic combination of classifiers with application to AdaBoost. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4472 LNCS, pp. 171–179). Springer Verlag. https://doi.org/10.1007/978-3-540-72523-7_18
