Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities

Abstract

In certain polytopal domains Ω, in space dimension d = 2, 3, we prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in H¹(Ω) for weighted analytic function classes. These classes comprise in particular solution sets of source and eigenvalue problems for elliptic PDEs with analytic data. Functions in these classes are locally analytic on open subdomains D ⊂ Ω, but may exhibit isolated point singularities in the interior of Ω or corner and edge singularities at the boundary ∂Ω. The exponential approximation rates are shown to hold in space dimension d = 2 on Lipschitz polygons with straight sides, and in space dimension d = 3 on Fichera-type polyhedral domains with plane faces. The constructive proofs indicate that NN depth and size increase poly-logarithmically with respect to the target NN approximation accuracy ε > 0 in H¹(Ω). The results cover solution sets of linear, second-order elliptic PDEs with analytic data and certain nonlinear elliptic eigenvalue problems with analytic nonlinearities and singular, weighted analytic potentials, as arise in electron structure models. Here, the functions correspond to electron densities that exhibit isolated point singularities at the nuclei.
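The constructions in the paper emulate hp-type approximation on meshes geometrically refined toward the singular points and edges. A minimal illustrative sketch of the underlying mesh-grading idea (not the paper's construction, which attains the exponential rates via deep emulation of high polynomial degrees): a one-hidden-layer ReLU network realizes any continuous piecewise-linear interpolant exactly, and grading the mesh toward a point singularity of the model function f(x) = x^{1/2} already yields far better accuracy than a uniform mesh at the same network size. The helper name `relu_net`, the grading exponent 4, and the size N = 32 are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_net(knots, vals):
    """Realize the piecewise-linear interpolant of (knots, vals) exactly as a
    one-hidden-layer ReLU network: x -> vals[0] + sum_i w_i * relu(x - t_i)."""
    slopes = np.diff(vals) / np.diff(knots)
    weights = np.concatenate(([slopes[0]], np.diff(slopes)))  # jumps in slope
    t = knots[:-1]                                            # hidden-unit biases
    return lambda x: vals[0] + (weights[:, None] * relu(x[None, :] - t[:, None])).sum(axis=0)

f = np.sqrt   # model function x^{1/2}: analytic on (0, 1], point singularity at x = 0
N = 32        # number of mesh cells, i.e. hidden ReLU units

uniform = np.linspace(0.0, 1.0, N + 1)
graded = (np.arange(N + 1) / N) ** 4  # refined toward 0; exponent 4 roughly equilibrates the error for x^{1/2}

errs = {}
for name, knots in [("uniform", uniform), ("graded", graded)]:
    net = relu_net(knots, f(knots))
    # test grid that resolves every mesh cell, including the tiny ones near 0
    xs = np.concatenate([np.linspace(a, b, 64) for a, b in zip(knots[:-1], knots[1:])])
    errs[name] = np.abs(net(xs) - f(xs)).max()

print(errs)  # the graded mesh is orders of magnitude more accurate at equal size
```

With piecewise-linear pieces this grading only restores an algebraic rate; the paper's point is that deep ReLU NNs can additionally emulate high-order polynomials on such geometric meshes, which is what drives the exponential convergence in H¹(Ω).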

Citation (APA)
Marcati, C., Opschoor, J. A. A., Petersen, P. C., & Schwab, C. (2023). Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities. Foundations of Computational Mathematics, 23(3), 1043–1127. https://doi.org/10.1007/s10208-022-09565-9
