We introduce a class of analytic positive definite multivariate kernels which includes infinite dot-product kernels as sometimes used in machine learning, certain new nonlinearly factorizable kernels, and a kernel which is closely related to the Gaussian. Each such kernel reproduces in a certain "native" Hilbert space of multivariate analytic functions. If functions from this space are interpolated at scattered locations by translates of the kernel, we prove spectral convergence rates for the interpolants and all their derivatives. By truncation of the power series of the kernel-based interpolants, we constructively generalize the classical Bernstein theorem concerning polynomial approximation of analytic functions to the multivariate case. An application to machine learning algorithms is presented. © 2008 Springer Science+Business Media, LLC.
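The scattered-data interpolation described above can be sketched numerically. The following minimal example uses the exponential dot-product kernel k(x, y) = exp(⟨x, y⟩) = Σₙ ⟨x, y⟩ⁿ/n!, an analytic positive definite power series kernel of the kind discussed in the paper; the node set, target function, and dimension are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_dot_kernel(X, Y):
    """Exponential dot-product kernel k(x, y) = exp(<x, y>),
    an analytic positive definite power series kernel."""
    return np.exp(X @ Y.T)

# scattered interpolation nodes in [-1, 1]^2 (d = 2, chosen for illustration)
nodes = rng.uniform(-1.0, 1.0, size=(12, 2))

def target(X):
    # an analytic target function to be recovered from its node values
    return np.sin(X[:, 0]) * np.cos(X[:, 1])

# solve the symmetric positive definite interpolation system  K c = f
K = exp_dot_kernel(nodes, nodes)
coeffs = np.linalg.solve(K, target(nodes))

def interpolant(X):
    # s(x) = sum_j c_j k(x, x_j): a linear combination of kernel sections
    return exp_dot_kernel(X, nodes) @ coeffs

# the interpolant matches the data exactly at the nodes (up to rounding)
residual = np.max(np.abs(interpolant(nodes) - target(nodes)))
```

The spectral convergence rates proved in the paper concern how fast such interpolants (and their derivatives) approach an analytic target as nodes fill out the domain; this sketch only verifies the interpolation conditions themselves.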
Citation
Zwicknagl, B. (2009). Power series kernels. Constructive Approximation, 29(1), 61–84. https://doi.org/10.1007/s00365-008-9012-4