Analyzing and Improving Stein Variational Gradient Descent for High-dimensional Marginal Inference

11/13/2017
by Jingwei Zhuo, et al.

Stein variational gradient descent (SVGD) is a nonparametric inference method that iteratively transports a set of randomly initialized particles to approximate a differentiable target distribution, following the direction that maximally decreases the KL divergence within a vector-valued reproducing kernel Hilbert space (RKHS). Compared to Monte Carlo methods, SVGD is particle-efficient because of the repulsive force induced by kernels. In this paper, we present the first analysis of the high-dimensional performance of SVGD and demonstrate that the repulsive force drops at least polynomially with increasing dimension, which results in poor marginal approximation. To improve the marginal inference of SVGD, we propose Marginal SVGD (M-SVGD), which incorporates structural information described by a Markov random field (MRF) into the kernels. M-SVGD inherits the particle efficiency of SVGD and can be used as a general-purpose marginal inference tool for MRFs. Experimental results on grid-based Markov random fields show the effectiveness of our methods.
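For readers unfamiliar with the update the abstract refers to, below is a minimal sketch of one step of vanilla SVGD (not the paper's M-SVGD variant), using an RBF kernel with the common median-heuristic bandwidth. The function name, step size, and bandwidth rule are illustrative assumptions, not the authors' code. The `repulse` term is the kernel-gradient force whose decay in high dimensions the paper analyzes.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def svgd_update(particles, grad_log_p, stepsize=1e-2):
    """One vanilla SVGD step with an RBF kernel (illustrative sketch).

    particles:  (n, d) array of current particle positions
    grad_log_p: callable mapping (n, d) positions to (n, d) scores
    """
    n, _ = particles.shape
    sq_dists = squareform(pdist(particles)) ** 2
    # Median-heuristic bandwidth, a common default choice.
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8
    K = np.exp(-sq_dists / h)                 # kernel matrix k(x_j, x_i)
    grads = grad_log_p(particles)             # scores at each particle
    # Driving term: kernel-weighted average of the scores.
    attract = K @ grads
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) for the RBF kernel.
    repulse = (np.sum(K, axis=1, keepdims=True) * particles
               - K @ particles) * (2.0 / h)
    return particles + stepsize * (attract + repulse) / n

# Example: approximate a 2-D standard Gaussian, where grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2)) * 3.0
for _ in range(500):
    x = svgd_update(x, lambda p: -p)
```

Without the repulsive term, all particles would collapse onto nearby modes of the target; it is this term that makes SVGD particle-efficient, and its polynomial decay with dimension is what motivates the structured kernels in M-SVGD.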
