Hyper-parameter tuning under a budget constraint

Abstract

Hyper-parameter tuning is of crucial importance for real-world machine learning applications. While existing work mainly focuses on speeding up the tuning process, we propose to study hyper-parameter tuning under a budget constraint, which is a more realistic scenario when developing large-scale systems. We formulate the task as a sequential decision-making problem and propose a solution that uses a Bayesian belief model to predict future performance and an action-value function to plan and select the next configuration to run. With long-term prediction and planning capability, our method is able to stop unpromising configurations early and to adapt its tuning behavior to different budgets. Experimental results show that our method outperforms existing algorithms, including the state of the art, on real-world tuning tasks across a range of budgets.
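
The abstract only sketches the approach, so the following is a minimal, illustrative Python sketch of a budget-constrained tuning loop in the same spirit, not the authors' algorithm: each step allocates one epoch of budget, a naive extrapolation of the learning curve stands in for the paper's Bayesian belief model, and a greedy rule stands in for its planned action-value function. All names here (train_one_epoch, predicted_final, tune, the config grid) are hypothetical placeholders.

```python
"""Illustrative budget-constrained tuning loop (assumptions, not the paper's method)."""
import random


def train_one_epoch(config, epoch):
    # Hypothetical stand-in for one unit of training: validation accuracy
    # that saturates at a config-dependent ceiling, plus small noise.
    return config["ceiling"] * (1.0 - 0.5 ** (epoch + 1)) + random.gauss(0, 0.005)


def predicted_final(history, horizon):
    # Naive belief model: assume the latest per-epoch improvement decays
    # geometrically over the remaining horizon. The paper instead maintains
    # a Bayesian belief over learning curves; this is only an illustration.
    if len(history) < 2:
        return history[-1] if history else 0.0
    gain = max(history[-1] - history[-2], 0.0)
    return history[-1] + gain * sum(0.5 ** k for k in range(1, horizon + 1))


def tune(configs, total_budget, max_epochs):
    histories = {i: [] for i in range(len(configs))}
    spent = 0
    while spent < total_budget:
        # Only configurations that have not reached max_epochs are candidates.
        candidates = [i for i in histories if len(histories[i]) < max_epochs]
        if not candidates:
            break
        scores = {
            i: predicted_final(histories[i], max_epochs - len(histories[i]))
            for i in candidates
        }
        best = max(scores, key=scores.get)  # greedy action selection
        histories[best].append(train_one_epoch(configs[best], len(histories[best])))
        spent += 1  # one epoch consumes one unit of budget
    # Return the configuration with the best observed validation accuracy.
    winner = max(histories, key=lambda i: max(histories[i], default=0.0))
    return configs[winner], max(histories[winner], default=0.0)


if __name__ == "__main__":
    random.seed(0)
    grid = [{"lr": lr, "ceiling": random.uniform(0.6, 0.9)}
            for lr in (1e-3, 3e-3, 1e-2, 3e-2)]
    best_cfg, best_acc = tune(grid, total_budget=20, max_epochs=10)
    print("best config:", best_cfg, "val acc: %.3f" % best_acc)
```

In the paper, the greedy heuristic above is replaced by an action-value function planned under the Bayesian belief model, which is what allows unpromising configurations to be stopped early and the behavior to adapt to the size of the budget.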

Citation (APA)

Lu, Z., Chen, L., Chiang, C. K., & Sha, F. (2019). Hyper-parameter tuning under a budget constraint. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 5744–5750). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/796
