Learning gradients on manifolds

Abstract

A common belief in high-dimensional data analysis is that data are concentrated on a low-dimensional manifold. This motivates simultaneous dimension reduction and regression on manifolds. We provide an algorithm that learns gradients on manifolds for dimension reduction of high-dimensional data with few observations. We obtain generalization error bounds for the gradient estimates and show that the convergence rate depends on the intrinsic dimension of the manifold rather than on the dimension of the ambient space. We illustrate the efficacy of this approach empirically on simulated and real data and compare the method to other dimension reduction procedures.
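
The idea summarized above is that estimated gradients of the regression function identify the directions along which the response actually varies, and those directions can be used for dimension reduction. Below is a minimal illustrative sketch of this style of gradient-based dimension reduction. It is not the authors' estimator (which is kernel-based with its own regularization and analysis on the manifold); the local-linear gradient fit, neighborhood size, and ridge parameter here are hypothetical choices. The sketch approximates gradients with local linear fits, accumulates them into an empirical gradient outer product matrix, and takes that matrix's leading eigenvectors as the dimension-reduction directions.

# Illustrative sketch only: local-linear gradient estimates + gradient outer
# product eigendecomposition; not the paper's kernel-based gradient estimator.
import numpy as np

def gradient_outer_product_directions(X, y, n_neighbors=20, d_out=2, ridge=1e-3):
    """Estimate dimension-reduction directions from local gradient estimates.

    X : (n, p) design matrix, y : (n,) responses.
    Returns a (p, d_out) matrix whose columns span the estimated subspace.
    """
    n, p = X.shape
    G = np.zeros((p, p))                       # accumulated gradient outer products
    for i in range(n):
        # nearest neighbors of x_i (including x_i itself)
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[:n_neighbors]
        Z = X[idx] - X[i]                      # centered local design
        t = y[idx] - y[i]                      # centered local responses
        # ridge-regularized local linear fit: gradient estimate at x_i
        A = Z.T @ Z + ridge * np.eye(p)
        g = np.linalg.solve(A, Z.T @ t)
        G += np.outer(g, g)
    G /= n
    # leading eigenvectors of the empirical gradient outer product matrix
    eigvals, eigvecs = np.linalg.eigh(G)
    return eigvecs[:, ::-1][:, :d_out]

# Usage on synthetic data: y depends on X only through one direction (e_1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta = np.zeros(10); beta[0] = 1.0
y = np.sin(X @ beta) + 0.1 * rng.normal(size=200)
B = gradient_outer_product_directions(X, y, d_out=1)
print("leading direction (should align with e_1):", np.round(B[:, 0], 2))

In this toy setting the recovered leading direction should concentrate on the first coordinate, illustrating how gradient information exposes the low-dimensional structure relevant to the regression.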

Citation (APA)

Mukherjee, S., Wu, Q., & Zhou, D. X. (2010). Learning gradients on manifolds. Bernoulli, 16(1), 181–207. https://doi.org/10.3150/09-BEJ206
