Structured variational approximations with skew normal decomposable graphical models
Although there has been much recent work on flexible variational methods for Bayesian computation, Gaussian approximations with structured covariance matrices are often preferred for computational reasons in high-dimensional settings. This paper considers approximate inference methods for complex latent variable models where the posterior is close to Gaussian but shows some skewness in its marginals. We consider skew decomposable graphical models (SDGMs), which are based on the closed skew normal family of distributions, as variational approximations. These approximations can reflect the conditional independence structure of the true posterior and capture posterior skewness. Different parametrizations of this variational family are explored, and both the speed of convergence and the quality of the approximation can depend on the parametrization used. To increase flexibility, implicit copula SDGM approximations are also developed, in which elementwise transformations are applied to an approximately standardized SDGM random vector. Our parametrization of the implicit copula approximation is novel, even in the special case of a Gaussian approximation. Performance of the methods is examined in a number of real examples involving generalized linear mixed models and state space models. We conclude that our copula approaches are the most accurate, but that the SDGM methods are often nearly as good and have lower computational demands.
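As background, the following sketches the standard closed skew normal density underlying the SDGM family and the generic implicit copula construction referred to above. This is illustrative notation only: the paper's specific SDGM parametrizations and transformation families are not reproduced here, and the symbols (\mu, \Sigma, D, \nu, \Delta, t_{\gamma_i}) are generic rather than the paper's own.

A random vector x \in \mathbb{R}^p follows a closed skew normal distribution, x \sim \mathrm{CSN}_{p,q}(\mu, \Sigma, D, \nu, \Delta), if its density is

\[ f(x) = \frac{\phi_p(x; \mu, \Sigma)\, \Phi_q\!\big(D(x - \mu); \nu, \Delta\big)}{\Phi_q\!\big(0; \nu, \Delta + D \Sigma D^\top\big)}, \]

where \phi_p(\cdot; \mu, \Sigma) is the p-variate normal density and \Phi_q(\cdot; \nu, \Delta) is the q-variate normal distribution function with mean \nu and covariance \Delta; setting D = 0 recovers the Gaussian case. The implicit copula construction then transforms an approximately standardized SDGM vector z elementwise,

\[ \theta_i = t_{\gamma_i}(z_i), \qquad i = 1, \dots, p, \]

with each t_{\gamma_i} a monotone one-to-one map (the parameters \gamma_i are hypothetical here), so that the dependence structure is inherited from the SDGM while the transformations add marginal flexibility.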