On a Metropolis-Hastings importance sampling estimator
A classical approach for approximating expectations of functions w.r.t. partially known distributions is to average function values along a trajectory of a Metropolis-Hastings (MH) Markov chain. A key part of the MH algorithm is a suitable acceptance/rejection of a proposed state, which ensures the correct stationary distribution of the resulting Markov chain. However, the rejection of proposals also causes highly correlated samples; in particular, a rejected state is not taken into account any further. In contrast, we introduce an MH importance sampling estimator which explicitly incorporates all states proposed by the MH algorithm. The estimator satisfies a strong law of large numbers as well as a central limit theorem, and we additionally provide an explicit mean squared error bound. Remarkably, in contrast to its classical counterpart, the asymptotic variance of the MH importance sampling estimator involves no correlation term. Moreover, although the new estimator uses the same amount of information as the classical MH estimator, it can outperform the latter, as indicated by numerical experiments.
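To illustrate the contrast described above, the following is a minimal sketch, assuming a one-dimensional random-walk MH chain, a toy unnormalized target, and a simple self-normalized importance weight target(y)/q(y|x) for each proposed state. The names target, f, the step size, and this particular weight choice are illustrative assumptions and not necessarily the exact estimator analyzed in the paper; the sketch only shows the idea of reusing every proposal alongside the classical ergodic average.

```python
import numpy as np

def target(x):
    # unnormalized density of an assumed example target (standard normal)
    return np.exp(-0.5 * x**2)

def f(x):
    # function whose expectation under the target we estimate
    return x**2

rng = np.random.default_rng(0)
n, step = 100_000, 1.0
x = 0.0

mh_sum = 0.0            # classical MH ergodic average over chain states
is_num, is_den = 0.0, 0.0  # self-normalized IS sums over all proposed states

for _ in range(n):
    y = x + step * rng.standard_normal()  # symmetric random-walk proposal

    # every proposal contributes to the IS estimator; one illustrative weight
    # is w = target(y) / q(y | x), with q the Gaussian proposal density
    q = np.exp(-0.5 * ((y - x) / step) ** 2) / (step * np.sqrt(2 * np.pi))
    w = target(y) / q
    is_num += w * f(y)
    is_den += w

    # classical MH accept/reject (symmetric proposal => ratio of target values)
    if rng.uniform() < min(1.0, target(y) / target(x)):
        x = y
    mh_sum += f(x)

print("classical MH estimate:", mh_sum / n)
print("IS-over-proposals estimate:", is_num / is_den)
```

Both estimates should approach E[f] = 1 for this toy target; the sketch merely shows how rejected proposals, discarded by the classical average, still enter the weighted sum.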