Fast Stochastic Quadrature for Approximate Maximum-Likelihood Estimation

10/27/2020
by   Katharina Morik, et al.

Recent stochastic quadrature techniques for undirected graphical models rely on near-minimax degree-k polynomial approximations to the model’s potential function for inferring the partition function. While providing desirable statistical guarantees, typical constructions of such approximations are themselves not amenable to efficient inference. Here, we develop a class of Monte Carlo sampling algorithms for efficiently approximating the value of the partition function, as well as the associated pseudo-marginals. More precisely, for pairwise models with n vertices and m edges, the complexity can be reduced from O(d^k) to O(k^4 + kn + m), where d ≥ m is the parameter dimension. We also consider the use of stochastic quadrature for the problem of maximum-likelihood (ML) parameter estimation. For completely observed data, our analysis gives rise to a probabilistic bound on the log-likelihood of the model. Maximizing this bound yields an approximate ML estimate which, in analogy to the moment-matching of exact ML estimation, can be interpreted in terms of pseudo-moment-matching. We present experimental results illustrating the behavior of this approximate ML estimator.
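To make the core idea concrete, the following is a minimal sketch (not the paper's algorithm) of partition-function estimation via a degree-k polynomial approximation combined with Monte Carlo sampling. It builds a tiny pairwise binary model, replaces exp with a degree-k Chebyshev interpolant over the potential's range, and estimates Z by uniform sampling of states; the model, weights, and sample sizes are all illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny pairwise binary model: a 4-vertex chain with random
# edge weights. Small enough that the exact partition function is computable.
n = 4
edges = [(0, 1), (1, 2), (2, 3)]
w = rng.normal(scale=0.5, size=len(edges))

def potential(x):
    # theta . phi(x): sum of edge weights whose endpoints agree.
    return sum(wj for (i, j), wj in zip(edges, w) if x[i] == x[j])

# Exact partition function by enumeration (feasible only for tiny n).
states = list(itertools.product([0, 1], repeat=n))
Z_exact = sum(np.exp(potential(x)) for x in states)

# Degree-k Chebyshev interpolant of exp on the potential's range [a, b].
# Here the range is found by enumeration; in practice it would be bounded
# analytically from the parameter vector.
a = min(potential(x) for x in states)
b = max(potential(x) for x in states)
k = 6
p_k = np.polynomial.chebyshev.Chebyshev.interpolate(np.exp, deg=k, domain=[a, b])

# Monte Carlo estimate: Z ≈ |X| * E_uniform[p_k(theta . phi(X))].
samples = rng.integers(0, 2, size=(20_000, n))
vals = np.array([p_k(potential(x)) for x in samples])
Z_est = (2 ** n) * vals.mean()
```

The split mirrors the abstract's point: the polynomial approximation supplies the statistical guarantee, while the sampler supplies the tractable estimate of the (otherwise exponential) sum over states.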
