Bayesian Inference using the Proximal Mapping: Uncertainty Quantification under Varying Dimensionality

08/10/2021
by Maoran Xu, et al.

In statistical applications, it is common to encounter parameters supported on a space of varying or unknown dimension. Examples include fused lasso regression and matrix recovery under an unknown low rank. Although a point estimate is easy to obtain via optimization, quantifying its uncertainty is much more challenging. In the Bayesian framework, a major difficulty is that a prior associated with a p-dimensional measure assigns zero posterior probability to any lower-dimensional subset of dimension d<p; to avoid this, one must place an additional dimension-selection prior on d, which often leads to a highly combinatorial problem. To significantly reduce this modeling burden, we propose a new generative process for the prior: starting from a continuous random variable, such as a multivariate Gaussian, we transform it into a varying-dimensional space using the proximal mapping. This yields a large class of new Bayesian models that can directly exploit popular frequentist regularizations and their algorithms, such as the nuclear norm penalty and the alternating direction method of multipliers, while providing principled, probabilistic uncertainty estimation. We show that this framework is well justified in geometric measure theory and enjoys convenient posterior computation via standard Hamiltonian Monte Carlo. We demonstrate its use in the analysis of dynamic flow network data.
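To make the generative prior concrete, here is a minimal sketch of the idea described in the abstract, using the l1-norm proximal map (soft-thresholding) as a stand-in for the penalties mentioned (e.g., fused lasso, nuclear norm). The function name `prox_l1` and the specific parameter values are illustrative assumptions, not taken from the paper; the point is only that pushing a Gaussian draw through a proximal map produces draws with positive prior probability on every lower-dimensional support.

```python
import numpy as np

def prox_l1(z, lam):
    """Proximal map of lam * ||.||_1, i.e. soft-thresholding (illustrative choice)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

rng = np.random.default_rng(0)
p, lam, n_draws = 10, 1.0, 5000

# Generative prior: continuous Gaussian draws pushed through the proximal map.
z = rng.normal(size=(n_draws, p))   # continuous latent variable
theta = prox_l1(z, lam)             # transformed draws, exactly sparse with positive probability

# The induced prior spreads mass over support sizes d = 0, ..., p,
# rather than concentrating on the full p-dimensional space.
support_size = (theta != 0).sum(axis=1)
print(np.bincount(support_size, minlength=p + 1) / n_draws)
```

In an actual model, posterior inference would target the continuous latent variable (amenable to Hamiltonian Monte Carlo, as the abstract notes), with the proximal map carrying it into the varying-dimensional parameter space.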
