Support vector regression using Mahalanobis kernels


Abstract

In our previous work we showed that Mahalanobis kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis kernels for function approximation. We determine the covariance matrix for the Mahalanobis kernel using all the training data. Model selection is performed by line search: first the margin parameter and the error threshold are optimized, and then the kernel parameter is optimized. According to computer experiments on four benchmark problems, the estimation performance of a Mahalanobis kernel with a diagonal covariance matrix optimized by line search is comparable to or better than that of an RBF kernel optimized by grid search. © Springer-Verlag Berlin Heidelberg 2006.
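The Mahalanobis kernel replaces the Euclidean distance of the RBF kernel with a Mahalanobis distance whose covariance is estimated from the training data. The following is a minimal sketch of that idea, assuming a diagonal covariance matrix and scikit-learn's SVR with a precomputed kernel matrix; the function names, the kernel parameter delta, and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVR

def mahalanobis_kernel(X, Y, inv_diag, delta=1.0):
    """Mahalanobis kernel with a diagonal covariance matrix.

    K(x, y) = exp(-delta * (x - y)^T D^{-1} (x - y)),
    where the diagonal of D holds the per-feature variances of the training data.
    """
    # Scale each feature by its inverse standard deviation, then compute
    # pairwise squared Euclidean distances in the scaled space.
    Xs = X * np.sqrt(inv_diag)
    Ys = Y * np.sqrt(inv_diag)
    sq_dists = (
        np.sum(Xs ** 2, axis=1)[:, None]
        + np.sum(Ys ** 2, axis=1)[None, :]
        - 2.0 * Xs @ Ys.T
    )
    return np.exp(-delta * np.maximum(sq_dists, 0.0))

# Toy data; in the paper the covariance is estimated from all training data.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 3))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=100)

inv_diag = 1.0 / np.var(X_train, axis=0)  # diagonal covariance estimate
K_train = mahalanobis_kernel(X_train, X_train, inv_diag)

# Per the paper's line-search procedure, C (margin parameter) and epsilon
# (error threshold) would be tuned first, then the kernel parameter delta;
# fixed values are used here purely for illustration.
model = SVR(kernel="precomputed", C=10.0, epsilon=0.1)
model.fit(K_train, y_train)

X_test = rng.normal(size=(10, 3))
K_test = mahalanobis_kernel(X_test, X_train, inv_diag)
print(model.predict(K_test))
```

Because the full kernel matrix is precomputed, the same `mahalanobis_kernel` helper is reused at prediction time with the test points against the training points, keeping the covariance estimate fixed to the one obtained from the training data.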

Citation (APA)

Kamada, Y., & Abe, S. (2006). Support vector regression using Mahalanobis kernels. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4087 LNAI, pp. 144–152). Springer Verlag. https://doi.org/10.1007/11829898_13
