Learning bounds for support vector machines with learned kernels

Abstract

Consider the problem of learning a kernel for use in SVM classification. We bound the estimation error of a large-margin classifier when the kernel, relative to which this margin is defined, is chosen from a family of kernels based on the training sample. For a kernel family with pseudodimension d_φ, we present a bound of √(Õ(d_φ + 1/γ²)/n) on the estimation error for SVMs with margin γ. This is the first bound in which the relation between the margin term and the family-of-kernels term is additive rather than multiplicative. The pseudodimension of families of linear combinations of base kernels is the number of base kernels. Unlike in previous (multiplicative) bounds, there is no non-negativity requirement on the coefficients of the linear combinations. We also give simple bounds on the pseudodimension of families of Gaussian kernels. © Springer-Verlag Berlin Heidelberg 2006.
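
As a rough illustration (not taken from the paper), the Python sketch below compares the scaling of the additive bound √((d_φ + 1/γ²)/n) with an earlier-style multiplicative form √((d_φ · 1/γ²)/n), dropping the constants and logarithmic factors absorbed by Õ; the pseudodimension, margin, and sample size used are arbitrary assumed values.

import numpy as np

# Illustrative only: the shape of the estimation-error bounds, with the
# constants and log factors hidden in O~ dropped.

def additive_bound(d_phi, gamma, n):
    # Additive form from the abstract: kernel-family term and margin term are summed.
    return np.sqrt((d_phi + 1.0 / gamma**2) / n)

def multiplicative_bound(d_phi, gamma, n):
    # Earlier-style multiplicative form, shown only for a scaling comparison.
    return np.sqrt((d_phi * (1.0 / gamma**2)) / n)

# Assumed example values (hypothetical):
d_phi = 10      # pseudodimension, e.g. the number of base kernels in a linear combination
gamma = 0.1     # margin
n = 10_000      # training-sample size

print(additive_bound(d_phi, gamma, n))        # ~0.105
print(multiplicative_bound(d_phi, gamma, n))  # ~0.316

For a fixed sample size, the additive form grows roughly like the larger of the two terms, whereas the multiplicative form grows like their product, which is why the additive bound is the weaker requirement on n.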

Citation (APA)

Srebro, N., & Ben-David, S. (2006). Learning bounds for support vector machines with learned kernels. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4005 LNAI, pp. 169–183). Springer Verlag. https://doi.org/10.1007/11776420_15
