Multi-Objective Hyperparameter Optimization in Machine Learning—An Overview

62 citations · 92 Mendeley readers

Abstract

Hyperparameter optimization constitutes a large part of typical modern machine learning (ML) workflows. This arises from the fact that ML methods and corresponding preprocessing steps often only yield optimal performance when hyperparameters are properly tuned. In many applications, however, we are not solely interested in optimizing ML pipelines for predictive accuracy; additional metrics or constraints must be considered when determining an optimal configuration, resulting in a multi-objective optimization problem. This is often neglected in practice, due to a lack of knowledge and of readily available software implementations for multi-objective hyperparameter optimization. In this work, we introduce the reader to the basics of multi-objective hyperparameter optimization and motivate its usefulness in applied ML. Furthermore, we provide an extensive survey of existing optimization strategies from the domains of evolutionary algorithms and Bayesian optimization. We illustrate the utility of multi-objective optimization in several specific ML applications, considering objectives such as operating conditions, prediction time, sparseness, fairness, interpretability, and robustness.
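To make the multi-objective setting concrete, the following is a minimal illustrative sketch (not the survey's own code): a random search over a hypothetical hyperparameter space with two competing objectives, here a synthetic "error" that falls with model complexity and a "cost" that rises with it, followed by extraction of the Pareto front of non-dominated configurations. All names (`n_units`, `depth`, the objective formulas) are invented stand-ins.

```python
import random

def evaluate(config):
    # Hypothetical objectives: error shrinks with complexity, cost grows with it,
    # so no single configuration minimizes both (the typical multi-objective trade-off).
    complexity = config["n_units"] * config["depth"]
    error = 1.0 / (1.0 + complexity)   # lower is better
    cost = 0.001 * complexity          # lower is better
    return (error, cost)

def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only points not dominated by any other point.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

random.seed(0)
configs = [{"n_units": random.choice([8, 16, 32, 64]),
            "depth": random.randint(1, 4)} for _ in range(30)]
scores = [evaluate(c) for c in configs]
front = pareto_front(scores)
```

Instead of returning a single "best" configuration, the result is a set of trade-off solutions on the Pareto front, from which a practitioner picks according to application-specific preferences; dedicated multi-objective optimizers (evolutionary algorithms, multi-objective Bayesian optimization) search this space far more efficiently than random sampling.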

Citation (APA)
Karl, F., Pielok, T., Moosbauer, J., Pfisterer, F., Coors, S., Binder, M., … Bischl, B. (2023). Multi-Objective Hyperparameter Optimization in Machine Learning—An Overview. ACM Transactions on Evolutionary Learning and Optimization, 3(4). https://doi.org/10.1145/3610536
