Unsupervised nearest neighbors with kernels

Abstract

In this paper we introduce an extension of unsupervised nearest neighbors for embedding patterns into continuous latent spaces of arbitrary dimensionality with stochastic sampling. Distances in data space are employed as standard deviations for Gaussian sampling in latent space, and neighborhoods are preserved via the nearest neighbor data space reconstruction error. Like previous unsupervised nearest neighbors (UNN) variants, this approach is an iterative method that constructs a latent embedding by selecting the candidate position with the lowest error. Further, we introduce kernel functions for computing the data space reconstruction error in a feature space, which makes it possible to better handle non-linearities. Experimental studies show that kernel unsupervised nearest neighbors (KUNN) is an efficient method for embedding high-dimensional patterns. © 2012 Springer-Verlag.
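The following is a minimal sketch of the kind of procedure the abstract describes: patterns are embedded one at a time, candidate latent positions are drawn from a Gaussian whose standard deviation is a data-space distance, and the candidate with the lowest kernel-based reconstruction error is kept. The RBF kernel, the candidate-sampling heuristic, and the names kunn_embed, kernel_dsre, and their parameter defaults are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """RBF kernel matrix between two pattern matrices (rows are patterns)."""
    d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * d2)

def kernel_dsre(K, i, neigh_idx):
    """Kernel data space reconstruction error for pattern i, reconstructed as the
    feature-space mean of its latent neighbors:
    ||phi(x_i) - (1/k) sum_j phi(x_j)||^2 expanded with the kernel trick."""
    k = len(neigh_idx)
    return (K[i, i]
            - 2.0 / k * K[i, neigh_idx].sum()
            + K[np.ix_(neigh_idx, neigh_idx)].sum() / k**2)

def kunn_embed(X, q=2, k=5, n_candidates=30, gamma=1.0, rng=None):
    """Iteratively embed patterns X (rows) into a q-dimensional latent space.

    For each pattern, candidate latent positions are sampled from a Gaussian
    centered at the latent position of its nearest already-embedded pattern,
    with the data-space distance used as standard deviation; the candidate
    with the lowest kernel reconstruction error is kept.
    """
    rng = np.random.default_rng(rng)
    N = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    Z = np.zeros((N, q))
    embedded = [0]                                  # place the first pattern at the origin
    for i in range(1, N):
        emb = np.array(embedded)
        # nearest already-embedded pattern in data space
        d = np.linalg.norm(X[emb] - X[i], axis=1)
        j = emb[np.argmin(d)]
        sigma = d.min()
        best_err, best_z = np.inf, None
        for _ in range(n_candidates):
            z = rng.normal(Z[j], sigma, size=q)     # stochastic sampling in latent space
            # k nearest latent neighbors among the embedded patterns
            latent_d = np.linalg.norm(Z[emb] - z, axis=1)
            neigh = emb[np.argsort(latent_d)[:min(k, len(emb))]]
            err = kernel_dsre(K, i, neigh)
            if err < best_err:
                best_err, best_z = err, z
        Z[i] = best_z
        embedded.append(i)
    return Z
```

A usage example under the same assumptions: `Z = kunn_embed(X, q=2, k=5)` maps an N x d pattern matrix X to an N x 2 latent matrix Z; only the kernel matrix is needed to score candidates, which is what lets the reconstruction error be evaluated in feature space.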

Citation (APA)

Kramer, O. (2012). Unsupervised nearest neighbors with kernels. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7526 LNAI, pp. 97–106). https://doi.org/10.1007/978-3-642-33347-7_9
