In this paper we propose a simple and intuitive method for constructing partially linear models and, more generally, partially parametric models, using support vector machines for regression and, in particular, regularization networks (splines). The results are more satisfactory than those obtained with classical nonparametric approaches. The method rests on a suitable choice of kernel, relying on the properties of positive definite functions. No modification of the standard SVM algorithms is required, and the approach is valid for the ε-insensitive loss. The approach described here can be applied immediately to SVMs for classification and to other methods that use the kernel as an inner product. © Springer-Verlag Berlin Heidelberg 2005.
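As a hedged illustration of the general idea (not the paper's exact construction), a partially linear model f(x) = βᵀx_lin + g(x_nl) can be fit with an unmodified SVR by choosing the kernel as the sum of a linear kernel on the parametric components and an RBF kernel on the nonparametric ones: a sum of positive definite kernels is again positive definite, so the standard algorithm applies as-is. The column split and the `GAMMA` bandwidth below are assumptions made for the sketch.

```python
import numpy as np
from sklearn.svm import SVR

LIN_COLS = [0]   # columns entering the model linearly (assumption)
NL_COLS = [1]    # columns entering nonparametrically (assumption)
GAMMA = 1.0      # RBF bandwidth (assumption)

def partially_linear_kernel(X, Z):
    """Gram matrix of k(x, z) = <x_lin, z_lin> + exp(-GAMMA * ||x_nl - z_nl||^2).

    Both summands are positive definite kernels, so their sum is too,
    and a standard SVM solver can use it unchanged.
    """
    lin = X[:, LIN_COLS] @ Z[:, LIN_COLS].T
    Xn, Zn = X[:, NL_COLS], Z[:, NL_COLS]
    d2 = ((Xn[:, None, :] - Zn[None, :, :]) ** 2).sum(axis=-1)
    return lin + np.exp(-GAMMA * d2)

# Synthetic data: linear in x0, nonlinear in x1.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = 2.0 * X[:, 0] + np.sin(np.pi * X[:, 1])

model = SVR(kernel=partially_linear_kernel, C=10.0, epsilon=0.01)
model.fit(X, y)
```

Because scikit-learn accepts a callable kernel returning the Gram matrix, this requires no change to the SVR implementation itself, mirroring the abstract's claim that the standard algorithm carries over.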
CITATION STYLE
Matías, J. M. (2005). Partially parametric SVM. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3808 LNCS, pp. 67–75). Springer Verlag. https://doi.org/10.1007/11595014_7