Stein variational gradient descent on infinite-dimensional space and applications to statistical inverse problems

02/19/2021
by Junxiong Jia, et al.

For solving Bayesian inverse problems governed by large-scale forward problems, we present an infinite-dimensional version of the Stein variational gradient descent (iSVGD) method, which can efficiently generate approximate samples from the posterior. Specifically, we introduce the concept of an operator-valued kernel and the corresponding function-valued reproducing kernel Hilbert space (RKHS). Through the properties of the RKHS, we give explicit meaning to the infinite-dimensional objects (e.g., the Stein operator) and prove that they are indeed the limits of their finite-dimensional counterparts. Furthermore, by generalizing the change-of-variables formula, we construct iSVGD with preconditioning operators, yielding a more efficient algorithm. In the course of these generalizations, we introduce a regularity parameter s ∈ [0,1]. Our analysis shows that the naive version of iSVGD with preconditioning operators (s = 0, i.e., directly treating finite-dimensional objects as their infinite-dimensional counterparts) yields inaccurate estimates, and that the parameter s should be chosen strictly between 0 and 0.5. Finally, the proposed algorithms are applied to an inverse problem governed by the Helmholtz equation. Numerical results confirm our theoretical findings and demonstrate the potential usefulness of the proposed approach for posterior sampling in large-scale nonlinear statistical inverse problems.
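As background, the sketch below shows one step of the standard finite-dimensional SVGD update of Liu and Wang with a scalar Gaussian (RBF) kernel; the paper's iSVGD replaces this scalar kernel with an operator-valued kernel acting on function space and adds preconditioning. The function names (svgd_step, grad_log_post) and the median bandwidth heuristic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def svgd_step(X, grad_log_post, step_size=1e-2):
    """One standard SVGD update with an RBF kernel (a sketch, not the
    paper's iSVGD):  x_i <- x_i + eps * phi(x_i),  where
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
                             + grad_{x_j} k(x_j, x_i) ].
    X: (n, d) array of particles; grad_log_post(X): (n, d) array of
    posterior log-density gradients, one row per particle.
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]               # diffs[i, j] = x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)
    h = max(np.median(sq_dists) / np.log(n + 1), 1e-8)  # median heuristic bandwidth
    K = np.exp(-sq_dists / h)                           # K[i, j] = k(x_i, x_j)
    # Attraction: kernel-weighted posterior gradients pull particles
    # toward regions of high posterior density.
    drive = K @ grad_log_post(X)
    # Repulsion: sum_j grad_{x_j} k(x_j, x_i) = (2/h) sum_j K[i, j] (x_i - x_j),
    # which keeps the particle ensemble spread out.
    repulse = (2.0 / h) * np.einsum('ij,ijd->id', K, diffs)
    return X + step_size * (drive + repulse) / n
```

Iterating this map transports an initial particle ensemble toward the target posterior: the attraction term drives particles to high-density regions, while the kernel-gradient term acts as a repulsive force that prevents the ensemble from collapsing onto a single mode.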
