Learning gradients: Predictive models that infer geometry and statistical dependence

ISSN: 1532-4435

Abstract

The problems of dimension reduction and inference of statistical dependence are addressed by the modeling framework of learning gradients. The models we propose hold in Euclidean spaces as well as in the manifold setting. The central quantity in this approach is an estimate of the gradient of the regression or classification function. Two quadratic forms are constructed from gradient estimates: the gradient outer product and gradient-based diffusion maps. The first quantity can be used for supervised dimension reduction on manifolds as well as for inference of a graphical model encoding dependencies that are predictive of a response variable. The second quantity can be used for nonlinear projections that incorporate both the geometric structure of the manifold and the variation of the response variable on the manifold. We relate the gradient outer product to standard statistical quantities such as covariances and provide a simple and precise comparison of a variety of supervised dimension reduction methods. We provide rates of convergence both for inference of informative directions and for inference of a graphical model of variable dependencies.
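To make the first of these quantities concrete, below is a minimal sketch of supervised dimension reduction via the empirical gradient outer product. It assumes a simple local-linear gradient estimator rather than the kernel-regularized estimator analyzed in the paper; all function names and parameters here are illustrative, not the authors' implementation.

```python
import numpy as np

def estimate_gradients(X, y, k=15):
    """Estimate the gradient of the regression function at each sample
    via a local linear fit over the k nearest neighbors (a simple
    stand-in for the paper's kernel-regularized gradient estimator)."""
    n, p = X.shape
    grads = np.zeros((n, p))
    for i in range(n):
        # k nearest neighbors of x_i (the point itself is included)
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[:k]
        # local model: y ~ f(x_i) + grad^T (x - x_i)
        A = np.hstack([np.ones((k, 1)), X[nbrs] - X[i]])
        coef, *_ = np.linalg.lstsq(A, y[nbrs], rcond=None)
        grads[i] = coef[1:]  # drop the intercept, keep the gradient
    return grads

def gradient_outer_product(grads):
    """Empirical gradient outer product (1/n) sum_i grad_i grad_i^T;
    its top eigenvectors span the predictive directions."""
    return grads.T @ grads / grads.shape[0]

# Usage on synthetic data: recover a 2-dimensional predictive subspace.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
G = estimate_gradients(X, y, k=15)
Gamma = gradient_outer_product(G)
eigvals, eigvecs = np.linalg.eigh(Gamma)   # ascending eigenvalues
B = eigvecs[:, ::-1][:, :2]                # top-2 eigenvectors
X_reduced = X @ B                          # supervised dimension reduction
```

In this sketch the eigenvectors of the gradient outer product play the role of the informative directions discussed in the abstract; the gradient estimates could equally be fed into a graph construction to obtain gradient-based diffusion maps.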

Citation (APA)

Wu, Q., Guinney, J., Maggioni, M., & Mukherjee, S. (2010). Learning gradients: Predictive models that infer geometry and statistical dependence. Journal of Machine Learning Research, 11, 2175–2198.
