In our previous work we showed that Mahalanobis kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis kernels for function approximation. We determine the covariance matrix for the Mahalanobis kernel using all the training data. Model selection is done by line search: first the margin parameter and the error threshold are optimized, and then the kernel parameter is optimized. Computer experiments on four benchmark problems show that the estimation performance of a Mahalanobis kernel with a diagonal covariance matrix, optimized by line search, is comparable to or better than that of an RBF kernel optimized by grid search. © Springer-Verlag Berlin Heidelberg 2006.
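The abstract's central ingredient, a Mahalanobis kernel with a diagonal covariance estimated from all the training data, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the common form K(x, y) = exp(-γ · Σ_i (x_i − y_i)² / σ_i²), where the per-feature variances σ_i² are computed from the training set and γ is the kernel parameter tuned by line search; the function and variable names are the author's own choices.

```python
import numpy as np

def mahalanobis_kernel(X, Y, inv_var, gamma=1.0):
    # Diagonal Mahalanobis kernel:
    #   K(x, y) = exp(-gamma * sum_i (x_i - y_i)^2 / var_i)
    # inv_var holds 1/var_i per feature, so scaling each squared
    # coordinate difference amounts to a diagonal inverse covariance.
    diff = X[:, None, :] - Y[None, :, :]          # (n, m, d) pairwise differences
    d2 = (diff ** 2 * inv_var).sum(axis=-1)       # squared Mahalanobis distances
    return np.exp(-gamma * d2)

# Estimate the diagonal covariance from all training data, as the
# abstract describes, then build the training kernel matrix.
rng = np.random.default_rng(0)
X_train = rng.normal(scale=[1.0, 5.0], size=(100, 2))  # toy data, unequal scales
inv_var = 1.0 / X_train.var(axis=0)
K = mahalanobis_kernel(X_train, X_train, inv_var)
```

Because each feature is scaled by its own variance, the kernel is invariant to per-axis rescaling of the inputs, which is one intuition for why it can match a grid-searched RBF kernel while needing only a line search over γ.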
Citation
Kamada, Y., & Abe, S. (2006). Support vector regression using Mahalanobis kernels. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4087 LNAI, pp. 144–152). Springer Verlag. https://doi.org/10.1007/11829898_13