A study of tuning hyperparameters for support vector machines


Abstract

Automatic parameter selection is an important issue in making support vector machines (SVMs) practically useful. Most existing approaches apply the Newton method directly to compute the optimal parameters, treating parameter optimization as an unconstrained optimization problem. In this paper, the limitation of these existing approaches is stated, and a new methodology for optimizing kernel parameters, based on computing the gradient of a penalty function with respect to the RBF kernel parameters, is proposed. Simulation results show the feasibility of this new approach and demonstrate an improvement in generalization ability. © Springer-Verlag Berlin Heidelberg 2003.
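The sketch below illustrates the general idea of gradient-based RBF hyperparameter tuning described in the abstract; it is not the authors' algorithm. Instead of the paper's analytic gradient of a penalty function, it uses a central-difference estimate of the cross-validation error's gradient with respect to log-scaled C and gamma. The synthetic dataset, step size, and iteration count are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's method): descend a numerically
# estimated gradient of the cross-validation error with respect to the RBF
# kernel width gamma and the penalty parameter C, working in log space.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def cv_error(log_params):
    """5-fold cross-validation error as a function of (log C, log gamma)."""
    C, gamma = np.exp(log_params)
    scores = cross_val_score(SVC(C=C, gamma=gamma, kernel="rbf"), X, y, cv=5)
    return 1.0 - scores.mean()

def numerical_grad(f, x, eps=1e-2):
    """Central-difference gradient estimate of f at x."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

theta = np.log(np.array([1.0, 0.1]))  # initial (C, gamma), chosen arbitrarily
lr = 0.5                              # illustrative step size
for it in range(20):
    g = numerical_grad(cv_error, theta)
    theta -= lr * g                   # plain gradient descent in log space
    print(f"iter {it:2d}  C={np.exp(theta[0]):.3f}  "
          f"gamma={np.exp(theta[1]):.4f}  cv_error={cv_error(theta):.3f}")
```

Working in log space keeps both parameters positive and makes the steps scale-invariant; the cross-validation error is only piecewise smooth, which is one reason the paper's penalty-function gradient is preferable to a finite-difference surrogate like this.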


CITATION STYLE

APA

Quan, Y., Yang, J., & Ye, C. (2003). A study of tuning hyperparameters for support vector machines. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2667, 1006–1015. https://doi.org/10.1007/3-540-44839-x_106
