Frugal Optimization for Cost-related Hyperparameters


Abstract

The increasing demand for democratizing machine learning calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters that cause large variation in training cost, yet this effect is largely ignored by existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a new cost-frugal HPO solution. The core of our solution is a simple but new randomized direct-search method, for which we provide theoretical guarantees on the convergence rate and on the total cost incurred to achieve convergence. We provide strong empirical results in comparison with state-of-the-art HPO methods on large AutoML benchmarks.
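To make the method class concrete, below is a minimal sketch of a generic randomized direct search: sample a random unit direction, probe the objective at both the positive and negative step, move on improvement, and shrink the step size when neither probe helps. This is an illustrative outline of randomized direct search in general, not the paper's exact algorithm; all names, step-size rules, and constants here are assumptions.

```python
import math
import random

def randomized_direct_search(f, x0, step=1.0, min_step=1e-3, max_evals=500):
    """Minimize f by randomized direct search (illustrative sketch, not the
    paper's exact method). At each iteration, sample a random unit direction
    u, evaluate f at x + step*u and x - step*u, move to an improving point,
    and halve the step size when neither direction improves."""
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > min_step and evals < max_evals:
        # Sample a direction uniformly on the unit sphere via Gaussians.
        u = [random.gauss(0.0, 1.0) for _ in x]
        norm = math.sqrt(sum(c * c for c in u)) or 1.0
        u = [c / norm for c in u]
        improved = False
        for sign in (1.0, -1.0):
            cand = [xi + sign * step * ui for xi, ui in zip(x, u)]
            fc = f(cand)
            evals += 1
            if fc < fx:
                x, fx = cand, fc
                improved = True
                break
        if not improved:
            step *= 0.5  # shrink the step when neither probe improves
    return x, fx

if __name__ == "__main__":
    # Example: minimize a simple quadratic with optimum at (1, 1).
    quadratic = lambda v: sum((vi - 1.0) ** 2 for vi in v)
    best_x, best_f = randomized_direct_search(quadratic, [3.0, -2.0])
    print(best_x, best_f)
```

In a cost-aware HPO setting, one would additionally track the training cost of each probe; the sketch above only shows the search geometry.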

Citation (APA)

Wu, Q., Wang, C., & Huang, S. (2021). Frugal Optimization for Cost-related Hyperparameters. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 12A, pp. 10347–10354). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i12.17239
