Deep Snow: Synthesizing Remote Sensing Imagery with Generative Adversarial Nets

05/18/2020
by Christopher X. Ren, et al.

In this work we demonstrate that generative adversarial networks (GANs) can generate realistic pervasive changes in remote sensing imagery, even in an unpaired training setting. We investigate transformation quality metrics based on deep embeddings of the generated and real images, which enable visualization and understanding of the GAN's training dynamics and may provide a useful measure of how distinguishable the generated images are from real ones. We also identify artifacts that the GAN introduces into the generated images; these likely contribute to the differences observed between real and generated samples in the deep embedding feature space, even when the two appear perceptually similar.
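The abstract does not specify which embedding-based metric is used, but a standard way to quantify how distinguishable two sets of images are in a deep feature space is the Fréchet distance between Gaussian fits of their embeddings (as in FID). The sketch below is illustrative only and assumes the feature vectors (e.g. activations from a pretrained CNN applied to real and GAN-generated image patches) have already been extracted; the function names are hypothetical, not from the paper.

```python
import numpy as np

def _sqrtm_psd(a):
    # Square root of a symmetric positive semidefinite matrix
    # via eigendecomposition (clipping tiny negative eigenvalues).
    vals, vecs = np.linalg.eigh(a)
    vals = np.clip(vals, 0.0, None)
    return (vecs * np.sqrt(vals)) @ vecs.T

def frechet_distance(real_feats, fake_feats):
    """Squared Frechet distance between Gaussian fits of two
    embedding sets, each of shape (n_samples, feature_dim).

    A small value means the generated embeddings are hard to
    distinguish from the real ones under this Gaussian summary.
    """
    mu1, mu2 = real_feats.mean(axis=0), fake_feats.mean(axis=0)
    s1 = np.cov(real_feats, rowvar=False)
    s2 = np.cov(fake_feats, rowvar=False)
    # Tr((S1^{1/2} S2 S1^{1/2})^{1/2}) equals Tr(sqrtm(S1 S2)),
    # but this form stays symmetric PSD and is numerically safer.
    s1_half = _sqrtm_psd(s1)
    covmean = _sqrtm_psd(s1_half @ s2 @ s1_half)
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(s1 + s2 - 2.0 * covmean))
```

Tracking this distance over training iterations is one way to visualize the training dynamics the abstract alludes to: it should shrink as the generator improves, and a persistent floor can signal systematic artifacts in the generated images.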
