Relaxed exponential kernels for unsupervised learning

Abstract

Many unsupervised learning algorithms make use of kernels that rely on the Euclidean distance between two samples. However, the Euclidean distance is optimal only for Gaussian distributed data. In this paper, we relax the global Gaussian assumption implied by the Euclidean distance, and propose a local Gaussian model for the immediate neighbourhood of each sample, resulting in an augmented data space formed by the parameters of the local Gaussians. To this end, we propose a convolution kernel for the augmented data space. The factorisable nature of this kernel allows us to introduce (semi-)metrics for this space, which in turn yield relaxed versions of known kernels. We present empirical results to validate the utility of the proposed localized approach in the context of spectral clustering. The key result of this paper is that combining the local Gaussian model with measures that adhere to metric properties yields much better performance across different spectral clustering tasks. © 2011 Springer-Verlag.
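The following is a minimal sketch of the general idea described in the abstract, not the authors' exact construction: each sample's immediate neighbourhood (here its k nearest neighbours, a chosen parameter) is summarised by a local Gaussian (mean and covariance), a (semi-)metric between these Gaussians is computed (a symmetrised KL divergence is assumed here as one possible choice), and an exponential kernel over that distance is fed to spectral clustering as a precomputed affinity.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import SpectralClustering

def local_gaussians(X, k=10, reg=1e-3):
    """Fit a Gaussian (mean, covariance) to the k-nearest-neighbour
    neighbourhood of each sample; these parameters form the augmented
    data space. The regulariser reg is an assumption for stability."""
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(X)
    means = np.array([X[i].mean(axis=0) for i in idx])
    covs = np.array([np.cov(X[i], rowvar=False) + reg * np.eye(X.shape[1])
                     for i in idx])
    return means, covs

def symmetric_kl(m1, S1, m2, S2):
    """Symmetrised KL divergence between two Gaussians -- one possible
    (semi-)metric between local models, assumed here for illustration."""
    d = m1.shape[0]
    S1_inv, S2_inv = np.linalg.inv(S1), np.linalg.inv(S2)
    diff = m1 - m2
    kl12 = 0.5 * (np.trace(S2_inv @ S1) + diff @ S2_inv @ diff
                  - d + np.log(np.linalg.det(S2) / np.linalg.det(S1)))
    kl21 = 0.5 * (np.trace(S1_inv @ S2) + diff @ S1_inv @ diff
                  - d + np.log(np.linalg.det(S1) / np.linalg.det(S2)))
    return 0.5 * (kl12 + kl21)

def relaxed_exponential_kernel(X, k=10, gamma=1.0):
    """Exponential kernel on the augmented space of local Gaussians."""
    means, covs = local_gaussians(X, k=k)
    n = X.shape[0]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = symmetric_kl(means[i], covs[i],
                                             means[j], covs[j])
    return np.exp(-gamma * D)

# Usage: spectral clustering with the kernel as a precomputed affinity.
X = np.random.randn(200, 3)
K = relaxed_exponential_kernel(X, k=15, gamma=0.5)
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(X=K)
```

The neighbourhood size k, the regulariser, and the choice of symmetrised KL are all assumptions of this sketch; the paper's convolution kernel and its factorisation-based (semi-)metrics may differ.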

Citation (APA)

Abou-Moustafa, K., Shah, M., De La Torre, F., & Ferrie, F. (2011). Relaxed exponential kernels for unsupervised learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6835 LNCS, pp. 184–195). https://doi.org/10.1007/978-3-642-23123-0_19
