Stochastic Neural Networks with Infinite Width are Deterministic

01/30/2022
by Liu Ziyin, et al.

This work theoretically studies stochastic neural networks, a main class of neural networks in practical use. Specifically, we prove that as the width of an optimized stochastic neural network tends to infinity, its predictive variance on the training set decreases to zero. Two common examples covered by our theory are neural networks with dropout and variational autoencoders. Our result helps clarify how stochasticity affects the learning of neural networks and can thus inform the design of better architectures for practical problems.
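The paper's claim concerns optimized networks, but the underlying intuition can be sketched numerically: when a wide hidden layer's units are dropped out independently and the output averages over them with 1/width scaling, the predictive variance induced by the dropout masks shrinks as the width grows. The snippet below is a minimal illustration of that scaling on a random (untrained) one-hidden-layer network, not a reproduction of the paper's proof; the function name and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_predictive_variance(width, n_masks=2000, p=0.5):
    """Monte Carlo estimate of the output variance induced by dropout
    for a fixed random one-hidden-layer ReLU network (scalar input)."""
    W = rng.normal(size=width)            # input-to-hidden weights
    a = rng.normal(size=width) / width    # output weights with 1/width scaling
    h = np.maximum(W * 1.0, 0.0)          # hidden activations at input x = 1
    # Each unit is kept with probability p and rescaled by 1/p (inverted dropout).
    masks = rng.random((n_masks, width)) < p
    outputs = (masks * h * a / p).sum(axis=1)
    return outputs.var()

for width in [10, 100, 1000, 10000]:
    print(width, dropout_predictive_variance(width))
```

With the 1/width output scaling, each dropout mask perturbs the prediction by an average over many independent units, so the variance decays roughly like 1/width; the printed values shrink by orders of magnitude as the width increases.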
