WiSE-VAE: Wide Sample Estimator VAE

02/16/2019
by Shuyu Lin et al.

Variational Auto-encoders (VAEs) have been very successful as methods for forming compressed latent representations of complex, often high-dimensional, data. In this paper, we derive a variational lower bound, alternative to the one commonly used in VAEs, which aims to minimize aggregate information loss. Using our lower bound as the objective function for an auto-encoder enables us to place a prior on the bulk statistics, corresponding to the aggregate posterior over all latent codes, rather than on the posterior of each individual code as in the original VAE. This alternative form of prior constraint gives individual posteriors more flexibility to preserve the information needed for good reconstruction quality. We further derive an analytic approximation to our lower bound, leading to our proposed model, WiSE-VAE. Through various examples, we demonstrate that WiSE-VAE achieves excellent reconstruction quality in comparison to other state-of-the-art VAE models, while still retaining the ability to learn a smooth, compact representation.
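To make the contrast with the standard VAE objective concrete, the sketch below illustrates the general idea of aggregate-posterior regularization: rather than a per-sample KL penalty, a divergence is computed between batch-level latent statistics and the prior. This is a minimal illustration of the concept, not the paper's actual bound or its analytic approximation; the function name and the moment-matching choice are assumptions made for the example.

```python
# Illustrative sketch (not the paper's exact objective): a VAE-style loss that
# regularizes the *aggregate* posterior of a mini-batch toward the prior,
# instead of penalizing each per-sample posterior individually.
import torch
import torch.nn.functional as F

def aggregate_posterior_loss(x, x_recon, z):
    """x: inputs; x_recon: decoder output; z: sampled latent codes (batch, dim).

    The aggregate term moment-matches the batch of latent samples to a
    standard-normal prior via the closed-form KL between diagonal Gaussians;
    the batch statistics stand in for the aggregate posterior. All names here
    are hypothetical, chosen for illustration only.
    """
    recon = F.mse_loss(x_recon, x, reduction="mean")

    # Batch-level (aggregate) statistics of the latent codes.
    mu = z.mean(dim=0)
    var = z.var(dim=0, unbiased=False) + 1e-8

    # KL( N(mu, diag(var)) || N(0, I) ), summed over latent dimensions.
    agg_kl = 0.5 * (var + mu.pow(2) - 1.0 - var.log()).sum()

    return recon + agg_kl
```

Note that in this formulation no constraint is applied to any single code's posterior, which is consistent with the abstract's point that individual posteriors remain free to retain the information required for faithful reconstruction.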
