On evolutionary approaches to unsupervised nearest neighbor regression

Abstract

The detection of structures in high-dimensional data plays an important part in machine learning. Recently, we proposed a fast iterative strategy for non-linear dimensionality reduction based on the unsupervised formulation of K-nearest neighbor regression. As the unsupervised nearest neighbor (UNN) optimization problem does not allow the computation of derivatives, the use of direct search methods is reasonable. In this paper we introduce evolutionary optimization approaches for learning UNN embeddings. Two continuous variants are based on the CMA-ES, employing regularization via domain restriction and via penalizing extension in latent space. A combinatorial variant embeds the latent variables on a grid and performs stochastic swaps. We compare the results on artificial dimensionality reduction problems. © 2012 Springer-Verlag.
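The combinatorial variant mentioned in the abstract can be read as a simple local search: each pattern is assigned to a fixed node of a latent grid, pairs of assignments are swapped at random, and a swap is kept only if it lowers the data-space reconstruction error of KNN regression. The Python sketch below illustrates this idea under those assumptions; the error function, the greedy acceptance rule, and the helper names (`knn_regression_error`, `unn_grid_swaps`) are illustrative and not taken from the paper.

```python
import numpy as np

def knn_regression_error(X_latent, Y, k=2):
    """Data-space reconstruction error of KNN regression: each pattern is
    predicted as the mean of the patterns whose latent positions are its
    k nearest latent neighbors (excluding the pattern itself)."""
    d = np.linalg.norm(X_latent[:, None, :] - X_latent[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude the point itself
    idx = np.argsort(d, axis=1)[:, :k]     # k nearest latent neighbors
    Y_hat = Y[idx].mean(axis=1)            # KNN prediction in data space
    return np.mean(np.sum((Y - Y_hat) ** 2, axis=1))

def unn_grid_swaps(Y, grid, k=2, iters=5000, seed=0):
    """Combinatorial UNN sketch: latent positions are fixed grid nodes; we
    search over assignments of patterns to nodes by stochastic swaps and
    keep a swap only if it reduces the reconstruction error (assumption:
    greedy acceptance, as in a simple hill climber)."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(Y))         # random initial assignment
    best = knn_regression_error(grid[perm], Y, k)
    for _ in range(iters):
        i, j = rng.choice(len(Y), size=2, replace=False)
        perm[i], perm[j] = perm[j], perm[i]
        err = knn_regression_error(grid[perm], Y, k)
        if err < best:
            best = err                     # accept the improving swap
        else:
            perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm, best

# Toy usage: embed 3-D points lying near a curve onto a 1-D latent grid.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 30)
Y = np.column_stack([np.sin(4 * t), np.cos(4 * t), t]) + 0.01 * rng.standard_normal((30, 3))
grid = np.linspace(0.0, 1.0, 30).reshape(-1, 1)   # one latent node per pattern
perm, err = unn_grid_swaps(Y, grid, k=2)
print("final reconstruction error:", err)
```

A CMA-ES variant would instead treat the latent coordinates as continuous variables and minimize the same reconstruction error plus the regularization terms mentioned above (domain restriction, penalty on latent extension); that part is not sketched here.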

Citation (APA)

Kramer, O. (2012). On evolutionary approaches to unsupervised nearest neighbor regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7248 LNCS, pp. 346–355). https://doi.org/10.1007/978-3-642-29178-4_35
