On the Convergence of the Multi-scale Deep Neural Network (MscaleDNN) in Approximating Oscillatory Functions
In this paper, we derive diffusion models for the error evolution of a learning algorithm based on a multiscale deep neural network (MscaleDNN) <cit.> in approximating oscillatory functions and solutions of boundary value problems of differential equations. Diffusion models in the spectral domain for the error of an MscaleDNN trained by a gradient descent optimization algorithm are obtained in the limit where the learning rate goes to zero and the width of the network goes to infinity. The diffusion coefficients of these models have supports covering a wider range of frequencies as the number of scales used in the MscaleDNN increases, compared with those of a standard fully connected neural network. Numerical results for the diffusion models show faster error decay of the MscaleDNN over a wide frequency range, validating the advantage of using the MscaleDNN to approximate highly oscillatory functions.
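The abstract does not spell out the MscaleDNN architecture itself; as a minimal sketch, assuming the common construction from the cited MscaleDNN literature in which the input is multiplied by a set of scale factors and each scaled copy is fed to its own fully connected subnetwork whose outputs are summed, a PyTorch version might look as follows. The scale factors, width, depth, learning rate, and the oscillatory target sin(40πx) are illustrative choices, not values taken from the paper.

import torch
import torch.nn as nn

class MscaleDNN(nn.Module):
    """Sketch of a multi-scale DNN: the input is multiplied by a set of
    scale factors, each scaled copy passes through its own small fully
    connected subnetwork, and the subnetwork outputs are summed."""
    def __init__(self, scales=(1, 2, 4, 8), width=64, depth=3):
        super().__init__()
        self.scales = scales
        def subnet():
            layers = [nn.Linear(1, width), nn.Tanh()]
            for _ in range(depth - 1):
                layers += [nn.Linear(width, width), nn.Tanh()]
            layers.append(nn.Linear(width, 1))
            return nn.Sequential(*layers)
        self.subnets = nn.ModuleList(subnet() for _ in scales)

    def forward(self, x):
        # Each subnetwork sees the input stretched by a different factor,
        # shifting its effective spectral bias toward higher frequencies.
        return sum(net(s * x) for s, net in zip(self.scales, self.subnets))

# Fit an oscillatory target with plain gradient descent (hypothetical setup).
model = MscaleDNN()
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
x = torch.rand(1024, 1)
y = torch.sin(40 * torch.pi * x)
for step in range(2000):
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()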