Information Theoretic Sample Complexity Lower Bound for Feed-Forward Fully-Connected Deep Networks

07/01/2020
by Xiaochen Yang, et al.

In this paper, we study the sample complexity lower bound of a d-layer feed-forward, fully-connected neural network for binary classification, using information-theoretic tools. Specifically, we propose a backward data-generating process, in which the input is generated conditionally on the binary output and the network is parametrized by the weight matrices of its hidden layers. The resulting sample complexity lower bound is of order Ω(log(r) + p / (r d)), where p is the dimension of the input, r is the rank of the weight matrices, and d is the number of hidden layers. To the best of our knowledge, this is the first information-theoretic sample complexity lower bound for such networks.
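As a rough illustration of how the bound scales, the sketch below evaluates log(r) + p / (r d) for a few hypothetical configurations. The values of p, r, and d are not taken from the paper, and the Ω(·) notation hides unspecified constants, which are set to 1 here purely for illustration.

```python
import math

def lower_bound_scaling(p: int, r: int, d: int) -> float:
    """Scaling of the sample complexity lower bound
    Omega(log(r) + p / (r * d)), with the hidden constant set to 1.

    p: input dimension
    r: rank of the weight matrices
    d: number of hidden layers
    """
    return math.log(r) + p / (r * d)

# Hypothetical configurations, not drawn from the paper.
for p, r, d in [(1000, 10, 2), (1000, 10, 8), (1000, 50, 8)]:
    print(f"p={p}, r={r}, d={d}: "
          f"bound scales as {lower_bound_scaling(p, r, d):.1f}")
```

Note how the p / (r d) term shrinks as depth d or rank r grows, while the log(r) term increases only slowly, consistent with the form of the stated bound.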
