Improving ADA-boost as a popular ensemble in classification problems


Abstract

In data mining, many classification algorithms rely on a single learner, and the resulting classification accuracy is often limited. To improve accuracy, multiple learners can be trained and combined into an ensemble, which increases generalization ability and robustness [3]. Because of these advantages, ensemble classification has become a major research direction in machine learning. An ensemble is also much stronger than a single base learner at producing an accurate hypothesis. Ensembles can be categorized as homogeneous or heterogeneous, and as dependent or independent. Dependent ensemble methods such as boosting and the AdaBoost algorithm are promising ways to obtain an accurate hypothesis. In particular, AdaBoost can serve as an effective classifier ensemble for generating accurate results.
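The contrast the abstract draws, a single base learner versus a dependent ensemble of such learners, can be illustrated with a short sketch. The example below is not from the paper; it is a minimal illustration assuming scikit-learn is available, using its AdaBoostClassifier (whose default base learner is a depth-1 decision tree) and a synthetic dataset.

```python
# Minimal sketch (assumes scikit-learn): a single weak learner vs. an
# AdaBoost ensemble built from the same kind of weak learner.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary classification data, for illustration only.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Single base learner: a decision stump (depth-1 tree).
stump = DecisionTreeClassifier(max_depth=1).fit(X_train, y_train)

# Dependent ensemble: AdaBoost trains 50 stumps sequentially, reweighting
# the training examples that earlier stumps misclassified.
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

print("single stump accuracy:", accuracy_score(y_test, stump.predict(X_test)))
print("AdaBoost accuracy:    ", accuracy_score(y_test, ada.predict(X_test)))
```

On data like this the boosted ensemble typically outperforms the single stump, which is the kind of accuracy gain over a single base learner that the abstract attributes to dependent ensembles.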

CITATION STYLE

APA

Reddy, M. S. K., Kumar, K. E. N., & Rajput, D. S. (2019). Improving ADA-boost as a popular ensemble in classification problems. International Journal of Innovative Technology and Exploring Engineering, 8(9 Special Issue 3), 241–243. https://doi.org/10.35940/ijitee.I3043.0789S319
