Adaptive Boosting (Adaboost) is one of the best-known methods for building an ensemble of neural networks. In this paper we briefly analyze and combine two of the most important variants of Adaboost, Averaged Boosting and Conservative Boosting, in order to build a more robust ensemble of neural networks. The combined method, called Averaged Conservative Boosting (ACB), applies the conservative equation used in Conserboost along with the averaging procedure used in Aveboost to update the sampling distribution. We have tested the methods on seven datasets from the UCI repository. The results show that Averaged Conservative Boosting is the best performing method. © Springer-Verlag Berlin Heidelberg 2007.
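The abstract does not give the exact equations, but the described combination can be sketched as follows. Assuming a Conserboost-style conservative step that raises only the weights of misclassified patterns, and an Aveboost-style averaging of the resulting distribution with the previous one, a single ACB-like update of the sampling distribution might look like this (function name, signature, and the precise blending are illustrative assumptions, not the paper's verbatim formulation):

```python
import numpy as np

def acb_update(dist, misclassified, t, alpha):
    """Hypothetical sketch of one ACB sampling-distribution update.

    dist          -- current sampling distribution over patterns (sums to 1)
    misclassified -- boolean mask, True where network t erred
    t             -- 1-based index of the current boosting round
    alpha         -- weight of network t (as in standard Adaboost)
    """
    # Conservative step (Conserboost-style assumption): increase only
    # the weights of misclassified patterns, leave the rest untouched,
    # then renormalize to obtain a valid distribution.
    raw = dist * np.exp(alpha * misclassified.astype(float))
    raw = raw / raw.sum()
    # Averaged step (Aveboost-style assumption): blend the new
    # distribution with the accumulated previous one.
    return (t * dist + raw) / (t + 1)

# Toy example: 4 patterns, uniform start, pattern 2 misclassified.
dist = np.full(4, 0.25)
mis = np.array([False, False, True, False])
print(acb_update(dist, mis, t=1, alpha=0.5))
```

The averaging step damps abrupt changes in the distribution between rounds, which is the robustness argument usually made for Aveboost, while the conservative step avoids shrinking the weights of correctly classified patterns too aggressively.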
Torres-Sospedra, J., Hernández-Espinosa, C., & Fernández-Redondo, M. (2007). Improving adaptive boosting with a relaxed equation to update the sampling distribution. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4507 LNCS, pp. 119–126). Springer Verlag. https://doi.org/10.1007/978-3-540-73007-1_15