Polynomial differentiation decreases the training time complexity of physics-informed neural networks and strengthens their approximation power

Abstract

We present novel approximations of variational losses that are applicable to the training of physics-informed neural networks (PINNs). The formulations reflect classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations. The loss approximations rest on polynomial differentiation, realised by an extension of classic Gauss-Legendre cubatures that we term Sobolev cubatures, which serves as a replacement for automatic differentiation. We prove that the training time complexity of the resulting Sobolev-PINNs with polynomial differentiation is lower than that of PINNs relying on automatic differentiation. On top of the one-to-two order of magnitude speed-up, the Sobolev-PINNs are demonstrated to deliver closer solution approximations than established PINNs for prominent forward and inverse, linear and non-linear PDE problems.
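To illustrate the core idea of replacing automatic differentiation with polynomial differentiation, the following minimal sketch builds a Lagrange differentiation matrix on Gauss-Legendre nodes and applies it to sampled function values. It is not the authors' Sobolev-cubature implementation; the helper name gauss_legendre_diff_matrix, the node count, and the test function are assumptions chosen purely for illustration.

import numpy as np

# Hedged sketch (not the paper's Sobolev-cubature code): polynomial
# differentiation on Gauss-Legendre nodes via a Lagrange differentiation
# matrix D, so that D @ f(nodes) approximates f'(nodes).

def gauss_legendre_diff_matrix(n):
    """Return Gauss-Legendre nodes x and the differentiation matrix D."""
    x, _ = np.polynomial.legendre.leggauss(n)      # n nodes on [-1, 1]
    diff = x[:, None] - x[None, :]                 # pairwise node differences
    np.fill_diagonal(diff, 1.0)
    w = 1.0 / diff.prod(axis=1)                    # barycentric weights
    D = (w[None, :] / w[:, None]) / diff           # D_ij = (w_j/w_i)/(x_i-x_j)
    np.fill_diagonal(D, 0.0)
    np.fill_diagonal(D, -D.sum(axis=1))            # rows of D sum to zero
    return x, D

x, D = gauss_legendre_diff_matrix(20)
f = np.sin(np.pi * x)                              # sample a smooth function
error = np.max(np.abs(D @ f - np.pi * np.cos(np.pi * x)))
print(error)                                       # spectrally small error

Because the degree-(n-1) interpolant through the nodes is differentiated exactly, the derivative error decays spectrally for smooth functions; this kind of polynomial differentiation is what the Sobolev cubatures realise in place of automatic differentiation when evaluating the variational losses.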

Cite

APA

Suarez Cardona, J. E., & Hecht, M. (2023). Polynomial differentiation decreases the training time complexity of physics-informed neural networks and strengthens their approximation power. Machine Learning: Science and Technology, 4(4). https://doi.org/10.1088/2632-2153/acf97a
