An Improved AdaBoost Algorithm for Hyperparameter Optimization

Abstract

The AdaBoost algorithm is a typical Boosting algorithm and a successful representative of the Boosting family. It can upgrade a weak classifier, one that performs only slightly better than random guessing, into a strong classifier with high classification accuracy. Its hyperparameter n_estimators specifies the number of iterations of the base classifier: if the value is too large, the model tends to overfit; if it is too small, the model underfits. This parameter should therefore not be set arbitrarily, but chosen according to the characteristics of the current data set. To address the uncertainty in the number of iterations of the AdaBoost algorithm, this paper introduces a Bayesian optimization algorithm for hyperparameter tuning, so that the hyperparameter values of the AdaBoost algorithm suit the current data set, yielding a hyperparameter-optimized AdaBoost algorithm. Experimental results show that using the Bayesian optimization algorithm to tune the hyperparameters and applying the optimized values to the AdaBoost algorithm not only improves classification accuracy, but also avoids overfitting and underfitting of the model.
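To make the idea concrete, the following is a minimal sketch of Bayesian hyperparameter tuning for AdaBoost, assuming scikit-learn's AdaBoostClassifier and scikit-optimize's BayesSearchCV; the paper's exact search space, optimizer settings, and data set are not specified here, so the ranges and the example data set below are illustrative assumptions, not the authors' configuration.

```python
# Sketch: Bayesian optimization of AdaBoost's n_estimators (assumed setup,
# using scikit-learn + scikit-optimize; ranges below are illustrative).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from skopt import BayesSearchCV
from skopt.space import Integer, Real

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A Gaussian-process surrogate proposes promising hyperparameter settings,
# so far fewer model fits are needed than with an exhaustive grid search.
search = BayesSearchCV(
    AdaBoostClassifier(random_state=0),
    {
        # The iteration count discussed above: too large overfits, too small underfits.
        "n_estimators": Integer(10, 500),
        # A second illustrative hyperparameter, not required by the method.
        "learning_rate": Real(0.01, 1.0, prior="log-uniform"),
    },
    n_iter=25,  # number of Bayesian optimization steps (assumed)
    cv=5,       # 5-fold cross-validation scores each candidate setting
    random_state=0,
)
search.fit(X_train, y_train)

print("best hyperparameters:", search.best_params_)
print("test accuracy:", search.score(X_test, y_test))
```

The cross-validated score acts as the objective being maximized, which is what lets the tuned n_estimators adapt to the data set at hand rather than being fixed in advance.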

Citation (APA)

Gao, R., & Liu, Z. (2020). An Improved AdaBoost Algorithm for Hyperparameter Optimization. In Journal of Physics: Conference Series (Vol. 1631). IOP Publishing Ltd. https://doi.org/10.1088/1742-6596/1631/1/012048
