An algebraic characterization of the optimum of regularized kernel methods


Abstract

The representer theorem for kernel methods states that the solution of the associated variational problem can be expressed as a linear combination of a finite number of kernel functions. However, for non-smooth loss functions, the analytic characterization of the coefficients poses nontrivial problems. Standard approaches resort to constrained optimization reformulations which, in general, lack a closed-form solution. Herein, by a proper change of variable, it is shown that, for any convex loss function, the coefficients satisfy a system of algebraic equations in fixed-point form, which can be obtained directly from the primal formulation. The algebraic characterization is specialized to regression and classification methods, and the fixed-point equations are explicitly characterized for many loss functions of practical interest. The consequences of the main result are then investigated along two directions. First, the existence of an unconstrained smooth reformulation of the original non-smooth problem is proven. Second, in the context of SURE (Stein's Unbiased Risk Estimation), a general formula for the degrees of freedom of kernel regression methods is derived. © 2009 Springer Science+Business Media, LLC.
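To make the fixed-point characterization concrete: for a regression loss V applied to the residuals, the regularized problem min_f Σ_i V(y_i − f(x_i)) + (λ/2)‖f‖² together with the representer expansion f = Σ_j c_j k(·, x_j) yields the stationarity condition λc = V′(y − Kc), where K is the kernel matrix. The sketch below is an illustration under assumed data, a Gaussian kernel, and ad hoc parameter values (lam, delta, theta), not the authors' code: it solves the squared-loss case, where the fixed-point system is linear with closed form c = (K + λI)⁻¹y, evaluates the standard squared-loss degrees-of-freedom formula tr(K(K + λI)⁻¹), and runs a damped fixed-point iteration for the smooth Huber loss.

```python
import numpy as np

# Illustrative sketch, not the paper's implementation. Kernel, data,
# and all parameter values below are assumptions chosen for the demo.

def gaussian_kernel(X, Z, width=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

lam = 1.0
K = gaussian_kernel(X, X)
n = len(y)

# Squared loss V(r) = r^2 / 2: stationarity gives lam * c = y - K @ c,
# a linear fixed-point system with closed-form solution.
c_ridge = np.linalg.solve(K + lam * np.eye(n), y)
assert np.allclose(lam * c_ridge, y - K @ c_ridge)

# Degrees of freedom in the squared-loss case: trace of the hat matrix
# S = K (K + lam I)^{-1}, the quantity SURE-based model selection needs.
df = np.trace(K @ np.linalg.inv(K + lam * np.eye(n)))
print(f"degrees of freedom: {df:.2f}")

# Huber loss: V'(r) = clip(r, -delta, delta). The coefficients satisfy
# c = V'(y - K @ c) / lam; iterate that map with damping for stability.
delta = 0.5
theta = 1.0 / (1.0 + np.linalg.norm(K, 2) / lam)  # damping factor
c = np.zeros(n)
for _ in range(2000):
    c = (1 - theta) * c + (theta / lam) * np.clip(y - K @ c, -delta, delta)
print("fixed-point residual:",
      np.linalg.norm(lam * c - np.clip(y - K @ c, -delta, delta)))
```

For genuinely non-smooth losses such as the hinge or ε-insensitive loss, V′ must be replaced by a subdifferential, and the degrees-of-freedom formula above only covers the squared loss; handling the general convex case rigorously is precisely the subject of the paper.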

Citation (APA)

Dinuzzo, F., & De Nicolao, G. (2009). An algebraic characterization of the optimum of regularized kernel methods. Machine Learning, 74(3), 315–345. https://doi.org/10.1007/s10994-008-5095-1
