On How Well Generative Adversarial Networks Learn Densities: Nonparametric and Parametric Results

11/07/2018
by Tengyuan Liang

We study the rate of convergence for learning distributions with the Generative Adversarial Networks (GAN) framework, which subsumes Wasserstein, Sobolev, and MMD GANs as special cases. We cover a wide range of parametric and nonparametric target distributions, under a collection of objective evaluation metrics. On the nonparametric end, we investigate the minimax optimal rates and the fundamental difficulty of density estimation under the adversarial framework. On the parametric end, we establish a theory for neural network classes that characterizes the interplay between the choice of generator and discriminator. We investigate how to improve the GAN framework with better theoretical guarantees through the lens of regularization. We discover and isolate a new notion of regularization, called generator/discriminator pair regularization, that sheds light on the advantage of GANs over classic parametric and nonparametric approaches to density estimation.
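For context, the adversarial evaluation metrics referenced above are commonly written as integral probability metrics (IPMs); the display below is a standard formulation of that idea (the notation is ours, not quoted from the paper). Taking the discriminator class F to be the 1-Lipschitz functions recovers the Wasserstein-1 distance, a Sobolev ball recovers the Sobolev GAN metric, and the unit ball of an RKHS recovers MMD:

d_{\mathcal{F}}(\mu, \nu) \;=\; \sup_{f \in \mathcal{F}} \Big( \mathbb{E}_{X \sim \mu}[f(X)] \;-\; \mathbb{E}_{Y \sim \nu}[f(Y)] \Big)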
