Sorting high-dimensional patterns with unsupervised nearest neighbors


Abstract

In many scientific disciplines, structures in high-dimensional data have to be detected, e.g., in stellar spectra, genome data, or in face recognition tasks. In this work we present an approach to non-linear dimensionality reduction based on fitting nearest neighbor regression into the unsupervised regression framework for learning low-dimensional manifolds. The problem of optimizing latent neighborhoods is difficult to solve, but the unsupervised nearest neighbor (UNN) formulation allows an efficient strategy of iteratively embedding latent points into discrete neighborhood topologies. The choice of an appropriate loss function is relevant, in particular for noisy and high-dimensional data spaces. We extend UNN with the ε-insensitive loss, which makes it possible to ignore small residuals below a defined threshold. Furthermore, we introduce techniques to handle incomplete data. Experimental analyses on various artificial and real-world test problems demonstrate the performance of the approaches. © Springer-Verlag Berlin Heidelberg 2013.
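The iterative embedding strategy the abstract refers to can be illustrated with a minimal sketch: patterns are placed one by one onto a discrete 1-D latent topology (a line), each at the slot that minimizes the ε-insensitive reconstruction error, where every pattern is reconstructed as the mean of its K nearest latent neighbors. This is a simplified, hypothetical reading of the approach, not the paper's exact algorithm; the function names and the greedy insertion order are assumptions for illustration.

```python
import numpy as np

def eps_loss(residual, eps):
    """ε-insensitive squared loss: residual norms below eps count as zero."""
    r = max(np.linalg.norm(residual) - eps, 0.0)
    return r * r

def unn_embed_line(Y, K=2, eps=0.1):
    """Greedy UNN-style embedding onto a discrete 1-D latent topology.
    Each new pattern is tried in every gap of the current line; the gap
    with the lowest total ε-insensitive reconstruction error is kept.
    Simplified sketch, not the published algorithm."""
    order = [0]  # indices of Y already embedded, in line order
    for i in range(1, len(Y)):
        best_gap, best_err = 0, np.inf
        for gap in range(len(order) + 1):        # try every insertion slot
            trial = order[:gap] + [i] + order[gap:]
            err = 0.0
            for pos, idx in enumerate(trial):
                # K nearest neighbors on the discrete line (excluding self)
                nbrs = sorted(range(len(trial)), key=lambda p: abs(p - pos))
                nbrs = [p for p in nbrs if p != pos][:K]
                recon = np.mean([Y[trial[p]] for p in nbrs], axis=0)
                err += eps_loss(Y[idx] - recon, eps)
            if err < best_err:
                best_gap, best_err = gap, err
        order = order[:best_gap] + [i] + order[best_gap:]
    return order  # latent ordering of the patterns

# On data lying on a 1-D manifold, the recovered latent order is monotone
# (up to reversal), i.e., the patterns are "sorted" along the line.
Y = np.array([[0.0], [1.0], [2.0], [3.0]])
print(unn_embed_line(Y, K=2, eps=0.0))
```

With eps > 0, small reconstruction residuals are ignored entirely, which is what makes the loss robust against noise in high-dimensional data spaces.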

Citation (APA)

Kramer, O. (2013). Sorting high-dimensional patterns with unsupervised nearest neighbors. In Communications in Computer and Information Science (Vol. 358, pp. 250–267). Springer Verlag. https://doi.org/10.1007/978-3-642-36907-0_17
