Nonparametric regression on low-dimensional manifolds using deep ReLU networks: function approximation and statistical recovery

Abstract

Real-world data often exhibit low-dimensional geometric structures and can be viewed as samples near a low-dimensional manifold. This paper studies nonparametric regression of Hölder functions on low-dimensional manifolds using deep Rectified Linear Unit (ReLU) networks. Suppose n training data are sampled from a Hölder function in H^(s,α) supported on a d-dimensional Riemannian manifold isometrically embedded in R^D. A deep ReLU network architecture is designed to estimate the underlying function from the training data. The mean squared error of the empirical estimator is proved to converge in the order of n^(−2(s+α)/(2(s+α)+d)) log^3 n. This result shows that deep ReLU networks give rise to a fast convergence rate depending on the data intrinsic dimension d, which is usually much smaller than the ambient dimension D. It therefore demonstrates the adaptivity of deep ReLU networks to low-dimensional geometric structures in data and partially explains the power of deep ReLU networks in tackling high-dimensional data with low-dimensional geometric structures.
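The practical import of the rate n^(−2(s+α)/(2(s+α)+d)) log^3 n is that the exponent depends on the intrinsic dimension d rather than the ambient dimension D. A minimal sketch of that comparison, using only the symbols from the abstract (the numeric values for s+α, d, and D below are hypothetical, chosen purely for illustration):

```python
def rate_exponent(smoothness: float, dim: int) -> float:
    """Exponent e in the n^(-e) mean-squared-error rate,
    e = 2(s+alpha) / (2(s+alpha) + dim)."""
    return 2 * smoothness / (2 * smoothness + dim)

s_plus_alpha = 1.0   # assumed Hölder smoothness s + alpha
d, D = 4, 100        # assumed intrinsic vs. ambient dimension

# Rate governed by the intrinsic dimension d (the paper's result):
print(rate_exponent(s_plus_alpha, d))   # 2/6 ≈ 0.333 -> fast decay of the MSE
# Rate one would get if the exponent involved the ambient dimension D:
print(rate_exponent(s_plus_alpha, D))   # 2/102 ≈ 0.020 -> curse of dimensionality
```

With these assumed values, the manifold-adaptive exponent (≈ 1/3) is more than an order of magnitude larger than the ambient-dimension exponent (≈ 0.02), i.e. far fewer samples suffice for the same accuracy.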

Citation (APA)

Chen, M., Jiang, H., Liao, W., & Zhao, T. (2022). Nonparametric regression on low-dimensional manifolds using deep ReLU networks: function approximation and statistical recovery. Information and Inference, 11(4), 1203–1253. https://doi.org/10.1093/imaiai/iaac001
