ODDboost: Incorporating posterior estimates into adaboost


Abstract

Boosting methods, while among the best classification methods developed so far, are known to degrade in performance on noisy data and overlapping classes. In this paper we propose a new upper generalization bound for weighted averages of hypotheses, which uses posterior estimates for the training objects and is based on reducing the binary classification problem with overlapping classes to a deterministic problem. Given accurate posterior estimates, the proposed bound is lower than the existing bound of Schapire et al. [25]. We design an AdaBoost-like algorithm that optimizes the proposed generalization bound and show that, when combined with good posterior estimates, it outperforms standard AdaBoost on real-world data sets. © 2009 Springer Berlin Heidelberg.
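For context, the baseline the paper compares against is standard AdaBoost. Below is a minimal illustrative sketch of that baseline with axis-aligned decision stumps; it is not the ODDboost algorithm itself (the paper's bound-optimizing update using posterior estimates is not reproduced here), and all function names are hypothetical.

```python
import numpy as np

def adaboost_train(X, y, n_rounds=20):
    """Standard AdaBoost with decision stumps (illustrative sketch).
    X: (n, d) array; y: labels in {-1, +1}.
    Returns a list of (feature, threshold, polarity, alpha)."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)              # example weights D_t, initially uniform
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive stump search over features, thresholds, polarities
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = max(best_err, 1e-12)
        if eps >= 0.5:
            break                        # weak learner no better than chance
        alpha = 0.5 * np.log((1 - eps) / eps)
        j, thr, pol = best
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((j, thr, pol, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    """Sign of the alpha-weighted vote of the stumps."""
    score = np.zeros(len(X))
    for j, thr, pol, alpha in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(score)
```

ODDboost's contribution, per the abstract, is to replace the generalization bound that this exponential reweighting implicitly optimizes with one that also exploits posterior estimates for the training objects.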

Citation (APA)

Barinova, O., & Vetrov, D. (2009). ODDboost: Incorporating posterior estimates into adaboost. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5632 LNAI, pp. 178–190). https://doi.org/10.1007/978-3-642-03070-3_14
