Which is better? Regularization in RKHS vs ℝm on reduced SVMs


Abstract

In the SVM community, the learned classifier is a combination of selected basis functions, and SVMs employ two main regularization models to determine the combination coefficients. The most popular model, given m input samples, regularizes the norm of the classification function in a reproducing kernel Hilbert space (RKHS); by duality or the representer theorem, it is converted to an optimization problem in ℝm. Another important model is the generalized support vector machine (GSVM), in which the coefficient vector of the hypothesis is norm-regularized in the Euclidean space ℝm. In this work, we analyze the differences between the two models in terms of computational stability, computational complexity, and the efficiency of Newton-type algorithms, with particular attention to reduced SVMs for large-scale training problems. Many typical loss functions are considered. Our study shows that the GSVM model has more advantages than the RKHS model. Experiments are given to support our analysis.
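To make the distinction concrete, the following minimal sketch contrasts the two regularizers on a kernel expansion f(x) = Σⱼ αⱼ k(xⱼ, x). It is an illustration only: it assumes a squared-hinge loss, a Gaussian kernel, and plain gradient descent rather than the paper's Newton-type algorithms, and the data, hyperparameters, and function names (`gaussian_kernel`, `train`) are hypothetical. The RKHS model penalizes the function norm αᵀKα, while GSVM penalizes the Euclidean norm αᵀα of the coefficients themselves.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def train(K, y, lam=0.1, model="gsvm", steps=500, lr=0.01):
    # Minimize (1/m) * sum_i max(0, 1 - y_i * (K alpha)_i)^2 + lam * reg(alpha)
    # where reg(alpha) = alpha^T K alpha (RKHS model) or alpha^T alpha (GSVM).
    m = K.shape[0]
    alpha = np.zeros(m)
    for _ in range(steps):
        margin = y * (K @ alpha)
        # gradient of the averaged squared-hinge loss w.r.t. alpha
        g_loss = K @ (-2.0 * y * np.maximum(0.0, 1.0 - margin)) / m
        if model == "rkhs":
            g_reg = 2.0 * lam * (K @ alpha)   # d/d alpha of alpha^T K alpha
        else:
            g_reg = 2.0 * lam * alpha         # d/d alpha of alpha^T alpha
        alpha -= lr * (g_loss + g_reg)
    return alpha

# Toy data: labels determined by the sign of x1 + x2.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])
K = gaussian_kernel(X, X)
for model in ("rkhs", "gsvm"):
    alpha = train(K, y, model=model)
    acc = np.mean(np.sign(K @ alpha) == y)
    print(model, acc)
```

Note that in the GSVM branch the regularization gradient needs no kernel-matrix product, which hints at the complexity and stability advantages the paper analyzes, particularly when K is replaced by a reduced rectangular kernel matrix.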

Citation (APA)

Zhou, S. (2013). Which is better? Regularization in RKHS vs ℝm on reduced SVMs. Statistics, Optimization and Information Computing, 1(1), 82–106. https://doi.org/10.19139/soic.v1i1.27
