Stochastic Variance-Reduced Hamilton Monte Carlo Methods

02/13/2018
by   Difan Zou, et al.

We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve ϵ accuracy in 2-Wasserstein distance, our algorithm attains Õ(n + κ^2 d^{1/2}/ϵ + κ^{4/3} d^{1/3} n^{2/3}/ϵ^{2/3}) gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and establish the corresponding gradient complexity bound. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm.

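To illustrate the core idea, the sketch below plugs an SVRG-style variance-reduced gradient estimator into a stochastic-gradient HMC (underdamped Langevin) update. This is a minimal illustration under assumed choices, not the paper's exact algorithm: the toy Bayesian linear-regression potential, the Euler-type discretization, the friction parameter gamma, and all step sizes and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy finite-sum potential: U(x) = sum_i U_i(x) with
# U_i(x) = 0.5 * (y_i - a_i^T x)^2 + 0.5 * (lam / n) * ||x||^2
n, d = 1000, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
y = A @ x_true + 0.1 * rng.normal(size=n)
lam = 1.0

def grad_components(x, idx):
    """Sum of component gradients grad U_i(x) over the indices in idx."""
    Ai, yi = A[idx], y[idx]
    return Ai.T @ (Ai @ x - yi) + (lam / n) * len(idx) * x

def svrg_gradient(x, snapshot, full_grad_at_snapshot, batch):
    """SVRG-style unbiased, variance-reduced estimate of sum_i grad U_i(x)."""
    scale = n / len(batch)
    return (scale * (grad_components(x, batch) - grad_components(snapshot, batch))
            + full_grad_at_snapshot)

def svr_hmc(num_epochs=20, inner_steps=50, batch_size=10, eta=1e-4, gamma=5.0):
    """Stochastic-gradient HMC-style sampler using the variance-reduced gradient."""
    x = np.zeros(d)
    v = np.zeros(d)
    samples = []
    for _ in range(num_epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot, recomputed once per epoch (n evaluations).
        full_grad = grad_components(snapshot, np.arange(n))
        for _ in range(inner_steps):
            batch = rng.choice(n, size=batch_size, replace=False)
            g = svrg_gradient(x, snapshot, full_grad, batch)
            # Euler-type discretization of underdamped Langevin dynamics with friction gamma.
            v = v - eta * (gamma * v + g) + np.sqrt(2.0 * gamma * eta) * rng.normal(size=d)
            x = x + eta * v
            samples.append(x.copy())
    return np.array(samples)

samples = svr_hmc()
print("posterior-mean estimate:", samples[len(samples) // 2:].mean(axis=0))
```

The point of the estimator is that its variance shrinks as the iterate x approaches the snapshot, so only O(n) component gradients per epoch are spent on the full-gradient recomputation while inner steps use small mini-batches; this is the mechanism behind the stated gradient-complexity improvement over plain stochastic gradient HMC.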