Efficient approximations of kernel robust soft LVQ

Abstract

Robust soft learning vector quantization (RSLVQ) constitutes a probabilistic extension of learning vector quantization (LVQ) based on a labeled Gaussian mixture model of the data. Training optimizes the likelihood ratio of the model and recovers a variant similar to LVQ2.1 in the limit of small bandwidth. Recently, RSLVQ has been extended to a kernel version, thus opening the way towards more general data structures characterized in terms of a Gram matrix only. While leading to state-of-the-art results, this extension has the drawback that models are no longer sparse, and quadratic training complexity is encountered. In this contribution, we investigate two approximation schemes which lead to sparse models: k-approximations of the prototypes and the Nyström approximation of the Gram matrix. We evaluate the behavior of these approximations on several benchmarks. © 2013 Springer-Verlag.
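To illustrate the second approximation scheme mentioned in the abstract, the following is a minimal Python sketch of the Nyström idea: a Gram matrix K is approximated from a small subset of its columns (landmarks), giving a low-rank factorization that avoids storing or processing the full quadratic matrix. The RBF kernel, the landmark count, and the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nystroem_approximation(K, landmark_idx):
    """Nyström approximation of a Gram matrix from a subset of its columns.

    K            : (n, n) full Gram matrix (used here only for illustration;
                   in practice only the needed n*m entries are computed)
    landmark_idx : indices of the m << n landmark points
    Returns C and W_pinv such that K ≈ C @ W_pinv @ C.T
    """
    C = K[:, landmark_idx]          # (n, m) kernel values between all points and landmarks
    W = C[landmark_idx, :]          # (m, m) landmark-landmark block
    W_pinv = np.linalg.pinv(W)      # pseudo-inverse of the low-rank core
    return C, W_pinv

# Toy example: RBF Gram matrix on random data, approximated with 20 landmarks
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)

idx = rng.choice(len(X), size=20, replace=False)
C, W_pinv = nystroem_approximation(K, idx)
K_approx = C @ W_pinv @ C.T
print("relative approximation error:",
      np.linalg.norm(K - K_approx) / np.linalg.norm(K))
```

In a kernelized learner such as kernel RSLVQ, the factors C and W_pinv can stand in for the full Gram matrix, reducing memory and training cost from quadratic to linear in the number of data points (for fixed m).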

Citation (APA)

Hofmann, D., Gisbrecht, A., & Hammer, B. (2013). Efficient approximations of kernel robust soft LVQ. In Advances in Intelligent Systems and Computing (Vol. 198 AISC, pp. 183–192). Springer Verlag. https://doi.org/10.1007/978-3-642-35230-0_19
