Support vector regression methods for functional data


Abstract

Many regression tasks in practice involve digitized functions as predictor variables. This has motivated the development of regression methods for functional data. In particular, Nadaraya-Watson Kernel (NWK) and Radial Basis Function (RBF) estimators have recently been extended to functional nonparametric regression models. However, these methods do not allow for dimensionality reduction. For this purpose, we introduce Support Vector Regression (SVR) methods for functional data. These are formulated in the framework of approximation in reproducing kernel Hilbert spaces. On this general basis, some of their properties are investigated, emphasizing the construction of nonnegative definite kernels on functional spaces. Furthermore, the performance of SVR for functional variables is demonstrated on a real-world benchmark spectrometric data set, along with comparisons with NWK and RBF methods. Good predictions were obtained by all three approaches, but SVR additionally achieved about a 20% reduction of dimensionality. © Springer-Verlag Berlin Heidelberg 2007.
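To make the setting concrete, the following is a minimal sketch of the Nadaraya-Watson kernel (NWK) estimator for functional data that the paper uses as a baseline: each predictor is a curve observed on a discretized grid, curves are compared with an L2-type semi-metric, and predictions are kernel-weighted averages of the training responses. The synthetic curves, the bandwidth value, and the function names are illustrative assumptions, not the paper's actual data or implementation.

```python
import numpy as np

# Hypothetical synthetic functional data (NOT the paper's spectrometric set):
# each row of X is one curve sampled at 50 grid points.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
shifts = rng.uniform(0.0, 1.0, 100)
X = np.array([np.sin(2 * np.pi * (t + s)) for s in shifts])
y = X[:, :10].mean(axis=1)  # scalar response derived from each curve

def nwk_predict(X_train, y_train, X_new, h=0.2):
    """Nadaraya-Watson estimator for functional predictors.

    Uses a discretized L2 semi-metric between curves and a Gaussian
    kernel with bandwidth h; returns kernel-weighted response averages.
    """
    # Pairwise distances: (n_new, n_train) matrix of L2 curve distances.
    d = np.sqrt(((X_new[:, None, :] - X_train[None, :, :]) ** 2).mean(axis=2))
    w = np.exp(-((d / h) ** 2))            # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)   # weighted average per new curve

# Hold out the last 20 curves and predict their responses.
pred = nwk_predict(X[:80], y[:80], X[80:])
```

In contrast to this estimator, the SVR approach of the paper replaces the local averaging with an expansion in a reproducing kernel Hilbert space, where the sparsity of the support vector solution is what yields the dimensionality reduction reported in the abstract.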

Citation (APA)

Hernández, N., Biscay, R. J., & Talavera, I. (2007). Support vector regression methods for functional data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4756 LNCS, pp. 564–573). https://doi.org/10.1007/978-3-540-76725-1_59
