Wasserstein GAN

01/26/2017
by Martin Arjovsky, et al.

We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to other distances between distributions.
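The abstract itself does not describe the training procedure, but the full paper's core idea is to replace the GAN discriminator with a "critic" trained to estimate the Wasserstein-1 distance between the real and generated distributions, enforcing a Lipschitz constraint by clipping the critic's weights. A minimal NumPy sketch of the critic update, where the linear critic, toy data, learning rate, and clip value are all illustrative assumptions and not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a linear critic f(x) = w @ x on 2-D points.
# Real data ~ N(0, I); "generated" data ~ N(2, I). All choices illustrative.
w = rng.normal(size=2)
lr, clip = 0.05, 0.01  # learning rate and weight-clipping bound c

def critic_update(w, real, fake, lr, clip):
    # WGAN critic objective: maximize E[f(real)] - E[f(fake)].
    # For a linear critic, the gradient w.r.t. w is the difference of means.
    grad = real.mean(axis=0) - fake.mean(axis=0)
    w = w + lr * grad                 # gradient *ascent* on the critic
    return np.clip(w, -clip, clip)    # enforce the Lipschitz constraint

for _ in range(5):
    real = rng.normal(0.0, 1.0, size=(64, 2))
    fake = rng.normal(2.0, 1.0, size=(64, 2))
    w = critic_update(w, real, fake, lr, clip)

# The critic's value E[f(real)] - E[f(fake)] estimates (a scaled)
# Wasserstein distance; in the paper this curve tracks sample quality.
est = real.mean(axis=0) @ w - fake.mean(axis=0) @ w
```

In the full algorithm the critic is a neural network updated several times per generator step, and the generator is then trained to minimize this same estimate; the clipping step above is what the paper's later follow-ups (e.g. gradient penalty) replace.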
