Deterministic error analysis of support vector regression and related regularized kernel methods

ISSN: 1532-4435

Abstract

We introduce a new technique for the analysis of kernel-based regression problems. The basic tools are sampling inequalities, which apply to all machine learning problems involving penalty terms induced by kernels related to Sobolev spaces. They lead to explicit deterministic results concerning the worst-case behaviour of ε- and ν-SVRs. Using these, we show how to adjust regularization parameters to obtain the best possible approximation orders for regression. The results are illustrated by some numerical examples. ©2009 Christian Rieger and Barbara Zwicknagl.

Citation (APA)

Rieger, C., & Zwicknagl, B. (2009). Deterministic error analysis of support vector regression and related regularized kernel methods. Journal of Machine Learning Research, 10, 2115–2132.
