Kernel online learning with adaptive kernel width

Abstract

This paper presents a unified framework for kernel online learning (KOL) algorithms with adaptive kernels. Unlike traditional KOL algorithms, which apply a fixed kernel width throughout training, the kernel width is treated as an additional free parameter and adapted automatically. A robust training method is proposed based on an adaptive dead-zone scheme: the kernel weights and the kernel width are updated under a unified framework, sharing the same learning parameters. A theoretical convergence analysis shows that the proposed adaptive training method switches off learning when the training error is too small relative to the external disturbance. To regularize the number of kernel functions, a dictionary-quality measure, the cumulative coherence, is applied: a dictionary of predefined size is selected by online minimization of its cumulative coherence, without requiring any parameters derived from prior knowledge of the training samples. Simulation results show that the proposed algorithm adapts to the training data effectively under different initial kernel widths, and outperforms fixed-width kernel algorithms in both testing accuracy and convergence speed.
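The adaptive-width idea described above can be sketched in code. The following is a minimal illustration, not the paper's exact algorithm: the class name `AdaptiveWidthKOL`, the dead-zone threshold `eps`, and the coherence cap `mu_max` are assumed names, the single-sample coherence test stands in for the paper's online cumulative-coherence minimization, and the simple gradient step on the width replaces the paper's unified update with its analyzed learning parameters.

```python
import numpy as np

def gauss(x, c, sigma):
    """Gaussian kernel between sample x and centre c with width sigma."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(c)) ** 2)
                  / (2.0 * sigma ** 2))

class AdaptiveWidthKOL:
    """Sketch of online kernel regression with a gradient-adapted width.

    Illustrative only: hyperparameter names and defaults are assumptions,
    not values from the paper.
    """

    def __init__(self, sigma=1.0, eta=0.2, eps=1e-4, max_dict=10, mu_max=0.99):
        self.sigma = sigma        # adaptive kernel width (free parameter)
        self.eta = eta            # learning rate shared by weights and width
        self.eps = eps            # dead-zone threshold: skip tiny errors
        self.max_dict = max_dict  # predefined dictionary size
        self.mu_max = mu_max      # coherence cap for admitting new centres
        self.centres, self.alpha = [], []

    def predict(self, x):
        return sum(a * gauss(x, c, self.sigma)
                   for a, c in zip(self.alpha, self.centres))

    def update(self, x, y):
        e = y - self.predict(x)
        if abs(e) <= self.eps:    # dead zone: learning is switched off
            return e
        ks = [gauss(x, c, self.sigma) for c in self.centres]
        # chain rule for the width: d k / d sigma = k * ||x - c||^2 / sigma^3
        dfdsigma = sum(a * k * np.sum((np.asarray(x) - np.asarray(c)) ** 2)
                       / self.sigma ** 3
                       for a, k, c in zip(self.alpha, ks, self.centres))
        self.sigma = max(1e-3, self.sigma + self.eta * e * dfdsigma)
        # weight update on the current dictionary, same learning rate eta
        self.alpha = [a + self.eta * e * k for a, k in zip(self.alpha, ks)]
        # admit the sample only if the dictionary has room and the sample
        # is not too coherent with the existing centres
        if len(self.centres) < self.max_dict and (not ks or max(ks) < self.mu_max):
            self.centres.append(x)
            self.alpha.append(self.eta * e)
        return e
```

The dead zone keeps noise from driving updates, the coherence test keeps near-duplicate samples out of the dictionary, and the width `sigma` moves in the direction that reduces the instantaneous squared error.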

Citation (APA)

Fan, H., Song, Q., & Shrestha, S. B. (2015). Kernel online learning with adaptive kernel width. Neurocomputing, 175(Part A), 233–242. https://doi.org/10.1016/j.neucom.2015.10.055
