Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks

03/14/2022
by Andrea Basteri, et al.

Given any deep fully connected neural network initialized with random Gaussian parameters, we bound from above the quadratic Wasserstein distance between its output distribution and a suitable Gaussian process. Our explicit inequalities indicate how the hidden- and output-layer sizes affect the Gaussian behaviour of the network, and they quantitatively recover the known distributional convergence results in the wide limit, i.e., as all the hidden layer widths grow large.
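As an informal illustration of the phenomenon the abstract describes (not the paper's Wasserstein bound), the sketch below compares the empirical covariance of a randomly initialized ReLU network's outputs against the limiting Gaussian-process covariance given by the standard NNGP recursion. The choice of ReLU activations, the N(0, 1/fan_in) weight parameterization, and the absence of biases are assumptions made here for concreteness; the Cho–Saul arc-cosine formula used for the kernel step is specific to ReLU.

```python
import numpy as np

def relu_kernel(K):
    # One NNGP kernel step for ReLU: E[relu(u) relu(v)] for (u, v) ~ N(0, K)
    # (Cho-Saul arc-cosine kernel of degree 1). ReLU is an assumption here.
    d = np.sqrt(np.diag(K))
    cos = np.clip(K / np.outer(d, d), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def nngp_kernel(X, depth):
    # Limiting covariance of the output-layer pre-activations for a
    # depth-layer fully connected ReLU net with N(0, 1/fan_in) weights.
    K = X @ X.T / X.shape[1]  # first-layer pre-activation covariance
    for _ in range(depth - 1):
        K = relu_kernel(K)
    return K

def sample_network_outputs(X, widths, n_samples, rng):
    # Outputs of a ReLU MLP with i.i.d. N(0, 1/fan_in) weights (no biases),
    # with the parameters resampled n_samples times.
    outs = []
    for _ in range(n_samples):
        h = X
        for n_in, n_out in zip([X.shape[1]] + widths, widths + [1]):
            W = rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_out))
            z = h @ W                # pre-activations
            h = np.maximum(z, 0.0)   # ReLU on hidden layers
        outs.append(z[:, 0])         # final pre-activation = network output
    return np.array(outs)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 10))   # 5 inputs in dimension 10 (arbitrary choice)
widths = [512, 512]            # two wide hidden layers

samples = sample_network_outputs(X, widths, n_samples=20_000, rng=rng)
emp_cov = np.cov(samples, rowvar=False)
gp_cov = nngp_kernel(X, depth=len(widths) + 1)
print("max covariance gap:", np.abs(emp_cov - gp_cov).max())
```

As the hidden widths grow, the printed gap shrinks, consistent with the convergence the abstract refers to; the paper's contribution is to quantify this closeness explicitly in quadratic Wasserstein distance rather than via such a covariance comparison.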
