Combining local and global information for nonlinear dimensionality reduction

Abstract

Nonlinear dimensionality reduction is a challenging problem encountered in many areas of high-dimensional data analysis, including machine learning, pattern recognition, scientific visualization, and neural computation. Based on different geometric intuitions about manifolds, maximum variance unfolding (MVU) and Laplacian eigenmaps are designed to capture different aspects of a dataset. In this paper, combining the ideas of MVU and Laplacian eigenmaps, we propose a new nonlinear dimensionality reduction method called distinguishing variance embedding (DVE). DVE unfolds the dataset by maximizing the global variance subject to the proximity-preservation constraint that originates in Laplacian eigenmaps. We illustrate the algorithm on easily visualized examples of curves and surfaces, as well as on real images of rotating objects, faces, and handwritten digits. © 2009 Elsevier B.V. All rights reserved.
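The abstract frames DVE as a trade-off: spread the embedded points apart globally (the MVU idea) while keeping neighbors on the data graph close together (the Laplacian-eigenmaps idea). The sketch below is only an illustrative spectral relaxation of that trade-off, not the paper's exact optimization; the function name local_global_embedding, the k-NN graph construction, the centering-matrix objective, and the regularization constant are assumptions made for this example.

```python
# Hedged sketch: one plausible spectral relaxation of the trade-off described
# in the abstract (maximize global variance while preserving local proximity).
# It is NOT necessarily the DVE formulation of Wang & Li (2009).
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph


def local_global_embedding(X, n_components=2, n_neighbors=10, reg=1e-6):
    """Embed X (n_samples x n_features) into n_components dimensions.

    Maximizes y^T H y (global variance, H = centering matrix) relative to
    y^T L y (roughness on the k-NN graph), via a generalized eigenproblem.
    """
    n = X.shape[0]
    # Symmetrized k-nearest-neighbor adjacency and its graph Laplacian
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode="connectivity")
    W = 0.5 * (W + W.T)
    L = laplacian(W).toarray()
    # Centering matrix H: y^T H y is proportional to the variance of y
    H = np.eye(n) - np.ones((n, n)) / n
    # Top generalized eigenvectors of (H, L + reg*I): coordinates with large
    # global variance that vary little between neighboring points.
    # The small ridge term makes the Laplacian positive definite, as eigh
    # requires for the second matrix.
    evals, evecs = eigh(H, L + reg * np.eye(n))
    return evecs[:, -n_components:]
```

As a usage example, calling local_global_embedding(X, n_components=2) on a small synthetic curve or surface sample returns a two-dimensional embedding; the generalized eigenproblem balances the variance of the embedding coordinates against their smoothness over the neighborhood graph, which is the qualitative behavior the abstract attributes to DVE.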

Citation (APA)

Wang, Q., & Li, J. (2009). Combining local and global information for nonlinear dimensionality reduction. Neurocomputing, 72(10–12), 2235–2241. https://doi.org/10.1016/j.neucom.2009.01.006
