This work presents a modified boosting algorithm that avoids overfitting the training samples during training. The proposed algorithm updates the weight distribution according to the number of times each sample has been misclassified over the training iterations. Experimental results show that our approach offers several advantages over classical AdaBoost algorithms in terms of generalization error, overfitting avoidance, and classification performance. © 2012 Springer-Verlag.
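The abstract does not give the exact update rule, but the idea of weighting by a misclassification frequency count can be sketched as follows. This is a purely illustrative reading, not the authors' formula: a per-sample counter `miss_count` tracks how often each sample has been misclassified, and a hypothetical factor `1 / (1 + miss_count)` damps the exponential AdaBoost update for samples that are misclassified repeatedly (likely noise), which is one plausible way a frequency factor could curb overfitting. The decision-stump learner and toy usage are also illustrative.

```python
import numpy as np

def stump_train(X, y, w):
    """Find the weighted-error-minimizing threshold stump (feature, threshold, sign)."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, t, s)
    return best

def stump_predict(X, j, t, s):
    return np.where(X[:, j] <= t, s, -s)

def adaboost_freq(X, y, rounds=10):
    """AdaBoost with a hypothetical frequency-counting factor in the weight update."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    miss_count = np.zeros(n)  # how many times each sample has been misclassified so far
    ensemble = []
    for _ in range(rounds):
        err, j, t, s = stump_train(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)        # clamp to keep alpha finite
        alpha = 0.5 * np.log((1 - err) / err)        # standard AdaBoost round weight
        pred = stump_predict(X, j, t, s)
        miss_count[pred != y] += 1
        # Illustrative frequency factor (an assumption, not the paper's rule):
        # damp the update for samples misclassified many times, so persistent
        # outliers do not dominate the weight distribution.
        factor = 1.0 / (1.0 + miss_count)
        w = w * np.exp(-alpha * y * pred * factor)
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    agg = sum(a * stump_predict(X, j, t, s) for a, j, t, s in ensemble)
    return np.sign(agg)

# Toy usage on a linearly separable 1-D problem
X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost_freq(X, y, rounds=5)
print(predict(model, X))
```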
CITATION STYLE
Merjildo, D. A. F., & Ling, L. L. (2012). Enhancing the performance of AdaBoost algorithms by introducing a frequency counting factor for weight distribution updating. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7441 LNCS, pp. 527–534). https://doi.org/10.1007/978-3-642-33275-3_65