Continuous Hyper-parameter Configuration for Particle Swarm Optimization via Auto-tuning

Abstract

Hyper-parameter configuration is a relatively novel field of paramount importance in machine learning and optimization. Hyper-parameters refer to the parameters that control the behavior of an algorithm but are not tuned directly by that algorithm. For the hyper-parameters of an optimization algorithm such as Particle Swarm Optimization (PSO), hyper-parameter configuration is a nested optimization problem. Usually, practitioners need a second optimization algorithm, such as grid search or random search, to find proper hyper-parameters. However, this approach forces practitioners to understand two different algorithms. Moreover, hyper-parameter configuration algorithms have hyper-parameters of their own that must also be considered. In this work we use Particle Swarm Optimization to configure its own hyper-parameters. Results show that hyper-parameters configured by PSO are competitive with those found by other hyper-parameter configuration algorithms.
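The nested structure described in the abstract can be illustrated with a minimal sketch: an inner PSO minimizes an objective given hyper-parameters (inertia `w`, cognitive weight `c1`, social weight `c2`), and an outer PSO searches over those same hyper-parameters by running the inner PSO and scoring the result. This is not the paper's implementation; the benchmark function, bounds, and all parameter values below are illustrative assumptions.

```python
import random

def pso(objective, dim, bounds, w=0.7, c1=1.5, c2=1.5,
        n_particles=20, n_iters=50, seed=0):
    """Minimal PSO minimizer; w, c1, c2 are its hyper-parameters."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g] # global best so far
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # standard velocity update: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search bounds
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Outer level: the hyper-parameters (w, c1, c2) become the search space,
# and the meta-objective is the quality the inner PSO achieves with them.
def meta_objective(hp):
    w, c1, c2 = hp
    sphere = lambda x: sum(v * v for v in x)  # toy benchmark (assumption)
    _, val = pso(sphere, dim=5, bounds=(-5.0, 5.0), w=w, c1=c1, c2=c2)
    return val

# PSO configuring its own hyper-parameters (auto-tuning).
best_hp, best_val = pso(meta_objective, dim=3, bounds=(0.0, 2.5),
                        n_particles=8, n_iters=15, seed=1)
print("best (w, c1, c2):", best_hp, "inner objective:", best_val)
```

Note that the outer PSO here reuses default hyper-parameters for itself; the point of the sketch is only the nesting, where one run of the outer algorithm triggers many full runs of the inner one.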

Citation (APA)

Rojas-Delgado, J., Milián Núñez, V., Trujillo-Rasúa, R., & Bello, R. (2019). Continuous Hyper-parameter Configuration for Particle Swarm Optimization via Auto-tuning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11896 LNCS, pp. 458–468). Springer. https://doi.org/10.1007/978-3-030-33904-3_43
