Limitations on approximation by deep and shallow neural networks

11/30/2022
by Guergana Petrova, et al.

We prove Carl-type inequalities for the error of approximation of compact sets K by deep and shallow neural networks. These inequalities in turn give lower bounds on how well the functions in K can be approximated when the approximants are required to be outputs of such networks. Our results are obtained as a byproduct of the study of the recently introduced Lipschitz widths.
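For orientation only, here is a minimal sketch of the classical Carl inequality, which results of this kind parallel; it is not the paper's precise statement, and the notation (entropy numbers epsilon_k(K)_X, Kolmogorov widths d_k(K)_X, constant C_alpha) is assumed here rather than taken from the abstract.

% Classical Carl inequality, sketched as background (assumed standard form).
% K is a compact subset of a Banach space X, and alpha > 0 is arbitrary.
\[
  \sup_{1 \le k \le n} k^{\alpha}\, \epsilon_k(K)_X
  \;\le\; C_{\alpha}\, \sup_{1 \le k \le n} k^{\alpha}\, d_k(K)_X,
  \qquad n \ge 1,
\]
% where \epsilon_k(K)_X are the entropy numbers of K and d_k(K)_X its
% Kolmogorov widths. Because entropy numbers of many classical sets are
% known, an inequality of this shape becomes a lower bound on the
% right-hand side: the widths cannot decay faster than the entropy
% numbers permit.

In a Carl-type inequality of the kind the abstract describes, the role of d_k(K)_X would be played by the error of approximating K by outputs of networks of a given size, so that slow decay of the entropy numbers of K forces slow decay of that approximation error.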
