Variational Physics Informed Neural Networks: the role of quadratures and test functions

09/05/2021
by Stefano Berrone et al.

In this work we analyze how Gaussian or Newton-Cotes quadrature rules of varying precision and piecewise polynomial test functions of varying degree affect the convergence rate, with respect to mesh refinement, of Variational Physics Informed Neural Networks (VPINNs) applied to elliptic boundary-value problems. Within a Petrov-Galerkin framework relying on an inf-sup condition, we derive an a priori error estimate in the energy norm between the exact solution and a suitable high-order piecewise interpolant of the computed neural network. Numerical experiments confirm the theoretical predictions and indicate that the error decays at the same rate when the neural network itself, rather than its interpolant, is used. Our results suggest, somewhat counterintuitively, that for smooth solutions the best strategy for achieving a high error decay rate is to choose test functions of the lowest polynomial degree while using quadrature formulas of suitably high precision.
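To make the setting concrete, the sketch below illustrates the general VPINN idea referred to in the abstract: the PDE residual of a neural network ansatz is tested against piecewise polynomial test functions and evaluated with element-wise Gauss quadrature. It is a minimal, hypothetical example (not the authors' code) for the 1D Poisson problem -u'' = f on (0,1) with piecewise-linear "hat" test functions and a 3-point Gauss rule per element; boundary-condition enforcement and the inf-sup/stabilization aspects analyzed in the paper are omitted. All names (`u_theta`, `quad_points`, etc.) are illustrative assumptions.

```python
# Minimal VPINN-style loss sketch (assumptions: 1D Poisson -u'' = f on (0,1),
# small fully connected network u_theta, uniform mesh, piecewise-linear hat
# test functions, 3-point Gauss-Legendre quadrature per element).
import numpy as np
import torch

torch.set_default_dtype(torch.float64)

def f(x):                       # manufactured right-hand side (illustrative)
    return np.pi**2 * torch.sin(np.pi * x)

u_theta = torch.nn.Sequential(  # hypothetical neural network ansatz
    torch.nn.Linear(1, 20), torch.nn.Tanh(), torch.nn.Linear(20, 1)
)

N = 16                                      # number of mesh elements
nodes = torch.linspace(0.0, 1.0, N + 1)
h = 1.0 / N
q, w = np.polynomial.legendre.leggauss(3)   # reference Gauss points/weights on [-1, 1]
q, w = torch.tensor(q), torch.tensor(w)

def quad_points(a, b):
    """Map reference Gauss points to the element [a, b]; return points and Jacobian."""
    return 0.5 * (b - a) * q + 0.5 * (a + b), 0.5 * (b - a)

# Variational residuals r_i = ∫ u' v_i' dx - ∫ f v_i dx for interior hat functions v_i,
# each integral approximated by Gauss quadrature on the two elements supporting v_i.
residuals = []
for i in range(1, N):
    m = nodes[i]
    r = torch.zeros(())
    for a, b, slope in [(nodes[i - 1], m, 1.0 / h), (m, nodes[i + 1], -1.0 / h)]:
        x, jac = quad_points(a, b)
        xg = x.reshape(-1, 1).requires_grad_(True)
        u = u_theta(xg)
        du = torch.autograd.grad(u.sum(), xg, create_graph=True)[0].flatten()
        v = 1.0 - torch.abs(x - m) / h      # hat function values at quadrature points
        r = r + jac * torch.sum(w * (du * slope - f(x) * v))
    residuals.append(r)

# VPINN-style loss: sum of squared variational residuals (a boundary penalty
# would be added in practice to impose the Dirichlet data).
loss = torch.stack(residuals).pow(2).sum()
loss.backward()                             # gradients w.r.t. the network weights
```

In this sketch, varying the degree of the test functions or the order of the quadrature rule corresponds to the choices whose effect on the convergence rate is studied in the paper.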
