Abstract
Hyperparameter tuning plays a significant role when building a machine learning or deep learning model. The tuning process aims to find the optimal hyperparameter setting for a model or algorithm from a pre-defined search space of hyperparameter configurations. Several tuning algorithms have been proposed in recent years, and there is scope for improvement in achieving a better exploration-exploitation tradeoff over the search space. In this paper, we present a novel hyperparameter tuning algorithm named adaptive Bayesian contextual hyperband (Adaptive BCHB) that incorporates a new sampling approach to identify the best regions of the search space and exploit the configurations that produce the minimum validation loss by dynamically updating the threshold in every iteration. The proposed algorithm is assessed using benchmark models and datasets on traditional machine learning tasks. Adaptive BCHB shows a significant improvement in accuracy and computational time for different types of hyperparameters when compared with state-of-the-art tuning algorithms.
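To make the threshold idea concrete, below is a minimal, hypothetical sketch of a hyperband-style successive-halving loop in which the survival criterion is a validation-loss threshold recomputed in every round. This is not the authors' Adaptive BCHB method: the abstract does not specify the Bayesian sampling model or the exact threshold-update rule, so sample_config, evaluate, and the median-based threshold used here are illustrative stand-ins only.

```python
# Illustrative sketch only: a Hyperband-style loop that keeps configurations whose
# validation loss falls below a threshold updated in every round. All functions and
# the median threshold rule are hypothetical stand-ins, not the paper's algorithm.
import random


def sample_config():
    # Hypothetical search space: learning rate and L2 regularization strength.
    return {"lr": 10 ** random.uniform(-4, -1), "l2": 10 ** random.uniform(-6, -2)}


def evaluate(config, budget):
    # Stand-in for training a model for `budget` epochs and returning validation loss.
    return (config["lr"] - 0.01) ** 2 + config["l2"] + random.gauss(0, 0.01) / budget


def adaptive_threshold_search(n_configs=27, min_budget=1, max_budget=27, eta=3):
    configs = [sample_config() for _ in range(n_configs)]
    budget = min_budget
    while budget <= max_budget and configs:
        losses = [(evaluate(c, budget), c) for c in configs]
        losses.sort(key=lambda t: t[0])
        # Dynamically updated threshold: here, the median loss of the current round.
        threshold = losses[len(losses) // 2][0]
        # Exploit only configurations that beat the threshold, up to the usual 1/eta cut.
        configs = [c for loss, c in losses if loss <= threshold][: max(1, len(losses) // eta)]
        budget *= eta
    return configs[0] if configs else None


if __name__ == "__main__":
    print("best configuration found:", adaptive_threshold_search())
```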
Citation
Swaminatha Rao, L. P., & Jaganathan, S. (2024). Adaptive Bayesian contextual hyperband: A novel hyperparameter optimization approach. IAES International Journal of Artificial Intelligence, 13(1), 775–785. https://doi.org/10.11591/ijai.v13.i1.pp775-785