Analysis on Gradient Propagation in Batch Normalized Residual Networks

12/02/2018
by Abhishek Panigrahi, et al.

In this work, we conduct a mathematical analysis of the effect of batch normalization (BN) on gradient backpropagation in residual network training, where BN is believed to play a critical role in addressing the gradient vanishing/explosion problem. By analyzing the mean and variance behavior of the input and the gradient in the forward and backward passes through the BN and residual branches, respectively, we show that the two work together to confine the gradient variance to a certain range across residual blocks during backpropagation. As a result, the gradient vanishing/explosion problem is avoided. We also show the relative importance of batch normalization with respect to the residual branches in residual networks.
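As a rough empirical illustration of the quantity the abstract refers to (not the paper's own derivation), the sketch below stacks simple fully connected residual blocks, with and without BN on the residual branch, and measures the variance of the gradient reaching each block's input during backpropagation. The block design (Linear-BN-ReLU-Linear with an identity shortcut), the depth, the width, the batch size, and the quadratic loss are all assumptions chosen only for illustration.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Identity shortcut plus a small residual branch, optionally with BN."""
    def __init__(self, width, use_bn):
        super().__init__()
        branch = [nn.Linear(width, width)]
        if use_bn:
            branch.append(nn.BatchNorm1d(width))
        branch += [nn.ReLU(), nn.Linear(width, width)]
        self.branch = nn.Sequential(*branch)

    def forward(self, x):
        return x + self.branch(x)

def grad_variance_per_block(use_bn, depth=50, width=128, batch=256):
    torch.manual_seed(0)
    blocks = [ResidualBlock(width, use_bn) for _ in range(depth)]
    x = torch.randn(batch, width, requires_grad=True)
    acts = [x]                       # activations entering each block
    h = x
    for blk in blocks:
        h = blk(h)
        h.retain_grad()              # keep gradients of intermediate activations
        acts.append(h)
    h.pow(2).mean().backward()       # arbitrary scalar loss, just to obtain gradients
    return [a.grad.var().item() for a in acts]

for use_bn in (False, True):
    v = grad_variance_per_block(use_bn)
    print(f"BN={use_bn}: grad var at top of stack = {v[-1]:.3e}, "
          f"at network input after backprop through all blocks = {v[0]:.3e}")

Running the script prints, for each setting, the gradient variance at the top of the stack and at the network input, so one can compare how much the variance drifts as the gradient propagates back through the residual blocks with and without BN.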
