A Combinatorial approach to hyperparameter optimization


Abstract

In machine learning, hyperparameter optimization (HPO) is essential for effective model training and significantly impacts model performance. Hyperparameters are predefined model settings that fine-tune a model's behavior and are critical for modeling complex data patterns. Traditional HPO approaches such as Grid Search, Random Search, and Bayesian Optimization have been widely used in this field. However, as datasets grow and models increase in complexity, these approaches often require significant time and resources. This research introduces a novel HPO approach based on t-way testing, a combinatorial software-testing technique that identifies faults using a test set covering all t-way parameter interactions. T-way testing substantially narrows the search space while still covering parameter interactions effectively. Our experimental results show that the approach reduces the number of necessary model evaluations and significantly cuts computational expense, while still outperforming traditional HPO approaches for the models studied in our experiments.
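To illustrate the idea behind the abstract, the sketch below builds a 2-way (pairwise) covering set over a small hyperparameter grid using a simple greedy construction. This is a generic illustration of combinatorial coverage, not the specific tool or algorithm used by the authors; the parameter names and values are hypothetical.

```python
from itertools import combinations, product

def pairwise_cover(params):
    """Greedily select configurations until every 2-way (pairwise)
    combination of hyperparameter values is covered at least once."""
    keys = list(params)

    def pairs_of(cfg):
        # All (param, value) pairs jointly exercised by one configuration.
        return {((keys[i], cfg[i]), (keys[j], cfg[j]))
                for i, j in combinations(range(len(keys)), 2)}

    # Every pairwise interaction that must appear in some configuration.
    uncovered = {
        ((k1, v1), (k2, v2))
        for k1, k2 in combinations(keys, 2)
        for v1 in params[k1]
        for v2 in params[k2]
    }
    candidates = list(product(*params.values()))
    suite = []
    while uncovered:
        # Greedy step: take the configuration covering the most
        # still-uncovered pairs.
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        uncovered -= pairs_of(best)
        suite.append(dict(zip(keys, best)))
    return suite

# Hypothetical grid: 3 x 3 x 3 = 27 configurations for exhaustive search.
grid = {
    "learning_rate": [0.1, 0.01, 0.001],
    "batch_size": [32, 64, 128],
    "optimizer": ["sgd", "adam", "rmsprop"],
}
suite = pairwise_cover(grid)
print(f"{len(suite)} pairwise-covering configs vs 27 for full grid search")
```

Each configuration in `suite` would then be trained and evaluated; the covering property guarantees every pair of hyperparameter values is tried together at least once, which is the source of the evaluation savings the abstract describes. Higher-strength (t > 2) coverage follows the same pattern with larger interaction tuples.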

Citation (APA)

Khadka, K., Chandrasekaran, J., Lei, Y., Kacker, R. N., & Kuhn, D. R. (2024). A Combinatorial approach to hyperparameter optimization. In Proceedings - 2024 IEEE/ACM 3rd International Conference on AI Engineering - Software Engineering for AI, CAIN 2024 (pp. 140–149). Association for Computing Machinery, Inc. https://doi.org/10.1145/3644815.3644941
