The distance between the weights of the neural network is meaningful

01/31/2021
by Liqun Yang, et al.

When applying neural networks, we need to select a suitable model based on the complexity of the problem and the scale of the dataset. Analyzing a network's capacity requires quantifying the information it has learned. This paper proves that the distance between the neural network's weights at different training stages can be used to directly estimate the information the network accumulates during training. Experimental results verify the utility of this method, and an application to label corruption is shown at the end.
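As a rough illustration of the quantity the abstract describes, one could measure the distance between weight snapshots taken at different training stages. The sketch below is hypothetical and not the paper's exact estimator: it uses plain L2 distance on flattened weights and a toy random-drift "training" loop, both of which are assumptions for demonstration only.

```python
import numpy as np

def weight_distance(w_a, w_b):
    """Euclidean (L2) distance between two flattened weight snapshots."""
    return float(np.linalg.norm(np.asarray(w_a) - np.asarray(w_b)))

# Toy stand-in for training: weights drift away from initialization.
# A real use would save model checkpoints at different training stages.
rng = np.random.default_rng(0)
w_init = rng.normal(size=100)
snapshots = [w_init + 0.1 * step * rng.normal(size=100) for step in range(5)]

# Distance from initialization at each stage, used here as a crude proxy
# for how much the network has changed over the course of training.
distances = [weight_distance(w_init, w) for w in snapshots]
```

In practice the snapshots would come from saved checkpoints of the same architecture, flattened into a single vector per stage.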
