On the relationship between variational inference and adaptive importance sampling

07/24/2019
by Axel Finke, et al.

The importance weighted autoencoder (IWAE) (Burda et al., 2016) and reweighted wake-sleep (RWS) algorithm (Bornschein and Bengio, 2015) are popular approaches which employ multiple samples to achieve bias reductions compared to standard variational methods. However, their relationship has hitherto been unclear. We introduce a simple, unified framework for multi-sample variational inference termed adaptive importance sampling for learning (AISLE) and show that it admits IWAE and RWS as special cases. Through a principled application of a variance-reduction technique from Tucker et al. (2019), we also show that the sticking-the-landing (STL) gradient from Roeder et al. (2017), which previously lacked theoretical justification, can be recovered as a special case of RWS (and hence of AISLE). In particular, this indicates that the breakdown of RWS -- but not of STL -- observed in Tucker et al. (2019) may not be attributable to the lack of a joint objective for the generative-model and inference-network parameters as previously conjectured. Finally, we argue that our adaptive-importance-sampling interpretation of variational inference leads to more natural and principled extensions to sequential Monte Carlo methods than the IWAE-type multi-sample objective interpretation.
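To make the multi-sample objective concrete, here is a minimal NumPy sketch of the IWAE bound L_K = E_q[log (1/K) Σ_k p(x, z_k)/q(z_k|x)] for a toy conjugate-Gaussian model (prior p(z) = N(0, 1), likelihood p(x|z) = N(z, 1), proposal q(z|x) = N(μ, σ²)). The model, parameter values, and function names are illustrative assumptions, not the paper's setup; the sketch only demonstrates that the bound tightens as the number of samples K grows.

```python
import numpy as np

def log_normal(x, mean, std):
    # Log density of a univariate Gaussian N(mean, std^2)
    return -0.5 * np.log(2 * np.pi * std**2) - 0.5 * ((x - mean) / std) ** 2

def iwae_bound(x, mu, sigma, num_samples, rng):
    """One Monte Carlo realisation of the K-sample IWAE bound
    log (1/K) sum_k p(x, z_k) / q(z_k | x), with z_k ~ q(z|x).
    Toy model: p(z) = N(0, 1), p(x|z) = N(z, 1), q(z|x) = N(mu, sigma^2).
    """
    z = mu + sigma * rng.standard_normal(num_samples)
    log_w = (log_normal(z, 0.0, 1.0)       # log prior p(z)
             + log_normal(x, z, 1.0)       # log likelihood p(x|z)
             - log_normal(z, mu, sigma))   # minus log proposal q(z|x)
    # log-mean-exp of the unnormalised weights, computed stably
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

rng = np.random.default_rng(0)
x = 1.5
# Average over replications: the bound is monotonically tighter in K
for K in (1, 5, 50):
    est = np.mean([iwae_bound(x, 0.5, 1.2, K, rng) for _ in range(2000)])
    print(f"K={K:2d}  IWAE bound ~ {est:.3f}")
```

With K = 1 the bound reduces to the standard ELBO; averaging many replications shows the estimate increasing towards the true log marginal likelihood as K grows, which is the bias reduction the abstract refers to.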
