Unsupervised kernel function building using maximization of information potential variability

27 citations · 8 Mendeley readers

This article is free to access.
Abstract

We propose a kernel function estimation strategy to support machine learning tasks by analyzing the input samples using Rényi's information metrics. Specifically, we aim to identify a Reproducing Kernel Hilbert Space (RKHS) that spreads the information force among data points as widely as possible, by maximizing the information potential variability of a Parzen-based pdf estimate. Thus, a Gaussian kernel bandwidth updating rule is obtained as a function of the forces induced by a given dataset. Our proposal is tested on synthetic and real-world datasets for clustering and classification tasks. The results show that the presented approach computes RKHSs that favor data group separability, attaining suitable learning performance in comparison with state-of-the-art algorithms.
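As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below selects a Gaussian kernel bandwidth by maximizing the variability of the pairwise information-potential terms of a Parzen-based density estimate over a candidate grid. The candidate grid, the variance-based objective, and all function names are assumptions made for illustration only.

```python
import numpy as np

def information_potential_terms(X, sigma):
    """Pairwise Gaussian terms G_sigma(x_i - x_j) whose mean is the
    Parzen-based information potential V(sigma)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def select_bandwidth(X, sigmas):
    """Pick the bandwidth whose off-diagonal information-potential terms
    vary the most (a simple proxy for information potential variability)."""
    best_sigma, best_var = None, -np.inf
    for sigma in sigmas:
        K = information_potential_terms(X, sigma)
        var = np.var(K[np.triu_indices_from(K, k=1)])  # off-diagonal terms only
        if var > best_var:
            best_sigma, best_var = sigma, var
    return best_sigma

# Toy usage: two Gaussian blobs with well-separated means
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
sigmas = np.logspace(-1, 1, 30)
print("selected bandwidth:", select_bandwidth(X, sigmas))
```

In this toy setting, the selected bandwidth sits on the scale of the between-group distances, which is the regime where the induced RKHS best separates the two blobs; the paper derives an updating rule for the bandwidth rather than a grid search.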

Citation (APA)

Álvarez-Meza, A. M., Cárdenas-Peña, D., & Castellanos-Domínguez, G. (2014). Unsupervised kernel function building using maximization of information potential variability. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8827, pp. 335–342). Springer Verlag. https://doi.org/10.1007/978-3-319-12568-8_41
