Convergence Rate Analysis for Deep Ritz Method

03/24/2021
by Chenguang Duan, et al.

Using deep neural networks to solve PDEs has attracted much attention recently. However, the theoretical understanding of why deep learning methods work lags far behind their empirical success. In this paper, we provide a rigorous numerical analysis of the deep Ritz method (DRM) <cit.> for second-order elliptic equations with Neumann boundary conditions. We establish the first nonasymptotic convergence rate in the H^1 norm for DRM using deep networks with ReLU^2 activation functions. In addition to providing a theoretical justification of DRM, our study also sheds light on how to set the hyper-parameters of depth and width to achieve the desired convergence rate in terms of the number of training samples. Technically, we derive bounds on the approximation error of deep ReLU^2 networks in the H^1 norm and on the Rademacher complexity of the non-Lipschitz composition of the gradient norm and a ReLU^2 network, both of which are of independent interest.
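The deep Ritz method replaces the PDE by minimization of the associated Ritz energy over a neural-network ansatz. A minimal, self-contained sketch for a 1D Neumann problem follows; the toy problem, width-8 network, Monte Carlo sampler, and finite-difference gradients are illustrative assumptions for exposition, not the paper's implementation:

```python
import math, random

random.seed(0)
WIDTH = 8  # hidden width of the toy network (illustrative choice)

def relu2(z):
    """ReLU^2 activation max(0, z)^2 (C^1-smooth, the activation analyzed in the paper)."""
    return max(0.0, z) ** 2

def net(x, p):
    """One-hidden-layer ReLU^2 network u_theta(x); p is a flat parameter list
    [w1 (WIDTH), b1 (WIDTH), w2 (WIDTH), b2]."""
    w1, b1, w2 = p[:WIDTH], p[WIDTH:2*WIDTH], p[2*WIDTH:3*WIDTH]
    return sum(w2[j] * relu2(w1[j] * x + b1[j]) for j in range(WIDTH)) + p[3*WIDTH]

def dnet(x, p, h=1e-5):
    """u'(x) by central differences (a real implementation would use autodiff)."""
    return (net(x + h, p) - net(x - h, p)) / (2 * h)

def f(x):
    """Source term for -u'' = f on (0,1); the exact Neumann solution is u(x) = cos(pi*x)."""
    return math.pi ** 2 * math.cos(math.pi * x)

def energy(p, xs):
    """Monte Carlo estimate of the Ritz energy E(u) = int 0.5*|u'|^2 - f*u dx,
    plus a mean-zero penalty pinning down the additive constant of the
    pure Neumann problem."""
    e = sum(0.5 * dnet(x, p) ** 2 - f(x) * net(x, p) for x in xs) / len(xs)
    mean_u = sum(net(x, p) for x in xs) / len(xs)
    return e + mean_u ** 2

# Fixed training sample in (0,1) and random initialization.
xs = [random.random() for _ in range(32)]
p = [random.uniform(-1, 1) for _ in range(3 * WIDTH)] + [0.0]

e0 = energy(p, xs)
lr, h = 1e-3, 1e-5
for _ in range(40):  # plain gradient descent on the empirical Ritz energy;
    grad = []        # parameter gradients again by central differences
    for i in range(len(p)):
        p[i] += h; ep = energy(p, xs)
        p[i] -= 2 * h; em = energy(p, xs)
        p[i] += h
        grad.append((ep - em) / (2 * h))
    p = [pi - lr * g for pi, g in zip(p, grad)]

e1 = energy(p, xs)
print(e0, e1)  # the empirical energy decreases over training
```

In practice one would use an autodiff framework so that both the spatial gradient in the energy and the parameter gradients come for free, but the structure above (sample, evaluate the energy functional, descend) is exactly the loop DRM performs.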
