Model comparison for Gibbs random fields using noisy reversible jump Markov chain Monte Carlo
The reversible jump Markov chain Monte Carlo (RJMCMC) method offers an across-model simulation approach to Bayesian estimation and model comparison by exploring a sampling space made up of several models of possibly differing dimensions. Applying RJMCMC to models such as Gibbs random fields is computationally difficult: the posterior distribution of each model is doubly intractable, because the likelihood itself involves a normalising constant that cannot be computed, so a transition of the Markov chain cannot be simulated exactly. In this paper we present a variant of RJMCMC, called noisy RJMCMC, in which the underlying transition kernel is replaced by an approximation based on unbiased estimators. Building on the theoretical developments of Alquier et al. (2016), we provide convergence guarantees for the noisy RJMCMC algorithm. Our experiments show that noisy RJMCMC can be considerably more efficient than exact alternatives, provided that a low-variance estimator is used, in agreement with our theoretical analysis.
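To make the idea of a "noisy" across-model move concrete, the sketch below shows one reversible jump update in which the intractable likelihood ratio is replaced by a plug-in estimate. This is a minimal illustration, not the authors' implementation: the model pair, priors, dimension-matching proposal, and the estimator `estimate_likelihood_ratio` are hypothetical placeholders that, for a Gibbs random field, would be replaced by problem-specific choices (e.g. an importance-sampling estimate of the ratio of normalising constants).

```python
# Illustrative sketch of one noisy reversible jump update.
# All model-specific pieces below are toy placeholders so the script runs end to end.
import numpy as np

rng = np.random.default_rng(0)


def log_prior(model, theta):
    """Hypothetical log-prior: standard normal on each parameter,
    uniform over the two candidate models."""
    return -0.5 * np.sum(theta ** 2) - 0.5 * len(theta) * np.log(2 * np.pi)


def propose_jump(model, theta):
    """Hypothetical between-model proposal: model 0 has one parameter,
    model 1 has two. The extra coordinate is drawn from N(0, 1), giving a
    dimension-matching move with unit Jacobian; the last return value is
    the log proposal/Jacobian correction."""
    if model == 0:
        u = rng.normal()
        return 1, np.append(theta, u), 0.0
    return 0, theta[:1], 0.0


def estimate_likelihood_ratio(model_new, theta_new, model_old, theta_old, data):
    """Placeholder for an unbiased estimator of the intractable likelihood
    ratio f(data | model_new) / f(data | model_old). Here it is faked with a
    tractable Gaussian likelihood plus small noise that mimics estimator
    variability, purely so the example is self-contained."""
    def loglik(theta):
        mean = theta[0] if len(theta) == 1 else theta[0] + theta[1]
        return np.sum(-0.5 * (data - mean) ** 2)

    noise = rng.normal(scale=0.05)  # stands in for Monte Carlo estimation noise
    return np.exp(loglik(theta_new) - loglik(theta_old) + noise)


def noisy_rjmcmc_step(model, theta, data):
    """One noisy reversible jump update: the exact acceptance ratio is
    replaced by one that plugs in the estimated likelihood ratio."""
    model_new, theta_new, log_q_correction = propose_jump(model, theta)
    lik_ratio_hat = estimate_likelihood_ratio(model_new, theta_new,
                                              model, theta, data)
    log_alpha = (np.log(lik_ratio_hat)
                 + log_prior(model_new, theta_new)
                 - log_prior(model, theta)
                 + log_q_correction)
    if np.log(rng.uniform()) < min(0.0, log_alpha):
        return model_new, theta_new
    return model, theta


if __name__ == "__main__":
    data = rng.normal(loc=1.0, size=50)
    model, theta = 0, np.array([0.0])
    visits = np.zeros(2)
    for _ in range(5000):
        model, theta = noisy_rjmcmc_step(model, theta, data)
        visits[model] += 1
    print("estimated posterior model probabilities:", visits / visits.sum())
```

As the paper's analysis suggests, the quality of the resulting chain hinges on the variance of the plugged-in estimator: the noisier the estimate of the likelihood ratio, the further the noisy kernel can drift from the exact one.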