A common belief in high-dimensional data analysis is that data are concentrated on a low-dimensional manifold. This motivates simultaneous dimension reduction and regression on manifolds. We provide an algorithm that learns gradients on manifolds and uses them for dimension reduction of high-dimensional data with few observations. We obtain generalization error bounds for the gradient estimates and show that the convergence rate depends on the intrinsic dimension of the manifold, not on the dimension of the ambient space. We illustrate the efficacy of this approach empirically on simulated and real data and compare the method to other dimension reduction procedures. © 2010 ISI/BS.
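To make the idea concrete, here is a minimal illustrative sketch of gradient-based dimension reduction. It is not the paper's algorithm (which estimates gradients via regularization in a reproducing kernel Hilbert space); instead it estimates gradients by local linear regression, accumulates them into a gradient outer product (GOP) matrix, and takes the top eigenvectors as the predictive directions. The sample size, neighbourhood size, and synthetic regression function are all assumptions for illustration.

```python
import numpy as np

# Sketch only: local-linear gradient estimates + gradient outer product (GOP).
# The paper uses an RKHS-regularized gradient estimator instead.
rng = np.random.default_rng(0)
n, p = 200, 10                            # few observations, moderate ambient dim
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[0] = 1.0                             # true single predictive direction
y = np.sin(X @ beta) + 0.05 * rng.normal(size=n)

k = 20                                    # neighbourhood size (assumed)
G = np.zeros((p, p))                      # gradient outer product matrix
for i in range(n):
    d = np.linalg.norm(X - X[i], axis=1)
    nbr = np.argsort(d)[1:k + 1]          # k nearest neighbours (excluding self)
    A = X[nbr] - X[i]                     # local coordinates around X[i]
    b = y[nbr] - y[i]
    g, *_ = np.linalg.lstsq(A, b, rcond=None)  # local gradient estimate
    G += np.outer(g, g) / n

evals, evecs = np.linalg.eigh(G)
top = evecs[:, -1]                        # leading eigenvector of the GOP
print(abs(top @ beta))                    # near 1 when the direction is recovered
```

The spectrum of the GOP matrix concentrates on the directions along which the regression function varies, which is why its leading eigenvectors serve as estimated dimension-reduction directions; the paper's contribution is an estimator whose error bounds scale with the manifold's intrinsic dimension rather than the ambient dimension p.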
Mukherjee, S., Wu, Q., & Zhou, D. X. (2010). Learning gradients on manifolds. Bernoulli, 16(1), 181–207. https://doi.org/10.3150/09-BEJ206