Hyper-parameter tuning for support vector machines by estimation of distribution algorithms

Abstract

Hyper-parameter tuning for support vector machines has been widely studied in the past decade. A variety of metaheuristics, such as Genetic Algorithms and Particle Swarm Optimization, have been considered for this task. Notably, exhaustive strategies such as Grid Search or Random Search continue to be used for hyper-parameter tuning and have recently shown results comparable to those of sophisticated metaheuristics. The main reason for the success of exhaustive techniques is that only two or three parameters need to be adjusted when working with support vector machines. In this chapter, we analyze two Estimation of Distribution Algorithms, the Univariate Marginal Distribution Algorithm (UMDA) and the Boltzmann Univariate Marginal Distribution Algorithm (BUMDA), to verify whether they preserve the effectiveness of Random Search while finding the optimal hyper-parameters more efficiently, without increasing the complexity of Random Search.
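
As a rough illustration of the idea in the abstract, the sketch below tunes the C and gamma hyper-parameters of an RBF-kernel SVM with a continuous, Gaussian UMDA-style loop. It assumes scikit-learn's SVC, a built-in demo dataset, log-scale search bounds, and truncation selection; these choices are illustrative assumptions and do not reproduce the chapter's exact UMDA or BUMDA configurations.

```python
# Minimal sketch: a continuous UMDA-style EDA tuning SVM hyper-parameters (C, gamma).
# Assumes scikit-learn; dataset, bounds, population size, and selection scheme are
# illustrative, not the chapter's experimental setup.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

def fitness(log_c, log_gamma):
    # Mean 5-fold cross-validation accuracy of an RBF SVM with the candidate values.
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return cross_val_score(clf, X, y, cv=5).mean()

pop_size, n_select, n_gens = 20, 10, 10
low, high = np.array([-2.0, -5.0]), np.array([4.0, 1.0])  # log10(C), log10(gamma) bounds
mean, std = (low + high) / 2.0, (high - low) / 4.0

for gen in range(n_gens):
    # Sample candidates from independent (univariate) Gaussians and clip to the bounds.
    pop = np.clip(rng.normal(mean, std, size=(pop_size, 2)), low, high)
    scores = np.array([fitness(c, g) for c, g in pop])
    elite = pop[np.argsort(scores)[-n_select:]]            # truncation selection
    # Re-estimate each marginal from the selected individuals (UMDA-style update).
    mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-3
    print(f"gen {gen}: best CV accuracy = {scores.max():.4f}")

print("best (log10 C, log10 gamma) ~", mean)
```

BUMDA differs from this plain UMDA loop mainly in how the selected individuals are weighted when the marginals are re-estimated (a Boltzmann-distribution weighting rather than equal weights), but the overall sample-select-update structure is the same.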

Citation (APA)

Padierna, L. C., Carpio, M., Rojas, A., Puga, H., Baltazar, R., & Fraire, H. (2017). Hyper-parameter tuning for support vector machines by estimation of distribution algorithms. In Studies in Computational Intelligence (Vol. 667, pp. 787–800). Springer Verlag. https://doi.org/10.1007/978-3-319-47054-2_53
