On the Opportunities and Pitfalls of Nesting Monte Carlo Estimators

09/18/2017
by Tom Rainforth, et al.

We present a formalization of nested Monte Carlo (NMC) estimation, whereby terms in an outer estimator themselves involve calculation of separate, nested, Monte Carlo (MC) estimators. We demonstrate that, under mild conditions, NMC can provide consistent estimates of nested expectations, including cases involving arbitrary levels of nesting; establish corresponding rates of convergence; and provide empirical evidence that these rates are observed in practice. We further establish a number of pitfalls that can arise from naive nesting of MC estimators, provide guidelines about how these can be avoided, and lay out novel methods for reformulating certain classes of nested expectation problems into single expectations, leading to improved convergence rates. Finally, we use one of these reformulations to derive a new estimator for use in discrete Bayesian experimental design problems which has a better convergence rate than existing methods. Our results have implications for a wide range of fields from probabilistic programming to deep generative models and serve both as an invitation for further inquiry and a caveat against careless use.
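To illustrate the setting the abstract describes, the following is a minimal sketch of a nested Monte Carlo estimator for a nested expectation of the form I = E_y[ f( y, E_{x|y}[ g(x, y) ] ) ], where each term of the outer estimator requires its own inner Monte Carlo estimate. The samplers and integrands (sample_y, sample_x_given_y, f, g) are illustrative placeholders, not the paper's specific examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_y(n):
    # Outer samples y_1, ..., y_N (illustrative choice of distribution)
    return rng.normal(size=n)

def sample_x_given_y(y, m):
    # Inner samples x_1, ..., x_M for a given outer sample y
    return rng.normal(loc=y, size=m)

def g(x, y):
    # Inner integrand (placeholder)
    return np.cos(x * y)

def f(y, inner_estimate):
    # Nonlinear mapping of the inner expectation; when f is nonlinear,
    # the inner MC error biases each outer term, which is why both the
    # outer and inner sample sizes must grow for consistency.
    return np.exp(inner_estimate) * y**2

def nmc_estimate(n_outer, m_inner):
    outer_terms = []
    for y in sample_y(n_outer):
        x = sample_x_given_y(y, m_inner)
        inner = np.mean(g(x, y))          # nested (inner) MC estimator
        outer_terms.append(f(y, inner))   # plugged into the outer integrand
    return np.mean(outer_terms)           # outer MC estimator

print(nmc_estimate(n_outer=1000, m_inner=100))
```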
