Hyperparameter Tuning in Machine Learning: A Comprehensive Review

  • Ilemobayo, J. A.
  • Durodola, O.
  • Alade, O.
  • et al.

Abstract

Hyperparameter tuning is essential for optimizing the performance and generalization of machine learning (ML) models. This review explores the critical role of hyperparameter tuning in ML, detailing its importance, applications, and various optimization techniques. Key factors influencing ML performance, such as data quality, algorithm selection, and model complexity, are discussed, along with the impact of hyperparameters like learning rate and batch size on model training. Various tuning methods are examined, including grid search, random search, Bayesian optimization, and meta-learning. Special focus is given to the learning rate in deep learning, highlighting strategies for its optimization. Trade-offs in hyperparameter tuning, such as balancing computational cost and performance gain, are also addressed. Concluding with challenges and future directions, this review provides a comprehensive resource for improving the effectiveness and efficiency of ML models.
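Two of the tuning methods the abstract names, grid search and random search, can be illustrated with a minimal sketch. This is not code from the article: the `validation_loss` objective below is a hypothetical stand-in for a real train-and-evaluate run, and the hyperparameter names (learning rate, batch size) follow the abstract's examples.

```python
import itertools
import random

# Hypothetical stand-in for a real training + validation run.
# In practice this would train a model with the given hyperparameters
# and return its validation loss; here it is a cheap toy function
# whose minimum sits at lr=0.01, batch_size=64.
def validation_loss(lr, batch_size):
    return (lr - 0.01) ** 2 + (batch_size - 64) ** 2 / 1e6

# Grid search: exhaustively evaluate every combination of the
# candidate values and keep the best-scoring configuration.
def grid_search(lrs, batch_sizes):
    return min(itertools.product(lrs, batch_sizes),
               key=lambda cfg: validation_loss(*cfg))

# Random search: evaluate a fixed budget of randomly sampled
# configurations (log-uniform learning rate, categorical batch size).
def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    trials = [(10 ** rng.uniform(-4, -1), rng.choice([16, 32, 64, 128]))
              for _ in range(n_trials)]
    return min(trials, key=lambda cfg: validation_loss(*cfg))
```

The trade-off the review discusses shows up directly here: grid search's cost grows multiplicatively with each hyperparameter added, while random search keeps a fixed evaluation budget regardless of how many hyperparameters are sampled.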

Citation (APA)

Ilemobayo, J. A., Durodola, O., Alade, O., Awotunde, O. J., Olanrewaju, A. T., Falana, O., … Edu, O. E. (2024). Hyperparameter Tuning in Machine Learning: A Comprehensive Review. Journal of Engineering Research and Reports, 26(6), 388–395. https://doi.org/10.9734/jerr/2024/v26i61188
