Quadratically constrained quadratic programming for subspace selection in kernel regression estimation

Abstract

In this contribution, we consider the problem of regression estimation. We elaborate on a framework based on functional analysis that gives rise to structured models in the context of reproducing kernel Hilbert spaces. In this setting, the task of input selection is converted into the task of selecting functional components depending on one (or more) inputs. In turn, the process of learning with embedded selection of such components can be formalized as a convex-concave problem. This results in a practical algorithm that can be implemented as a quadratically constrained quadratic programming (QCQP) optimization problem. We further investigate the mechanism of selection for the class of linear functions, establishing a relationship with LASSO. © Springer-Verlag Berlin Heidelberg 2008.

Citation (APA)

Signoretto, M., Pelckmans, K., & Suykens, J. A. K. (2008). Quadratically constrained quadratic programming for subspace selection in kernel regression estimation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5163 LNCS, pp. 175–184). https://doi.org/10.1007/978-3-540-87536-9_19
