Bayesian variational inference for exponential random graph models
Bayesian inference for exponential random graph models (ERGMs) is a doubly intractable problem, as the normalizing constants of both the likelihood and the posterior density are intractable. Markov chain Monte Carlo (MCMC) methods that yield Bayesian inference for ERGMs, such as the exchange algorithm, are asymptotically exact but computationally intensive, since a network has to be drawn from the likelihood at every iteration using an auxiliary MCMC sampler, for instance the "tie no tie" sampler. In this article, we propose several variational methods for posterior density estimation and model selection, which include (1) nonconjugate variational message passing combined with an adjusted pseudolikelihood and (2) stochastic variational inference based on Monte Carlo or importance sampling. These methods yield approximate Bayesian inference but can be orders of magnitude faster than MCMC. We illustrate these variational methods on real social networks and compare their accuracy with results obtained via MCMC.
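To make the "doubly intractable" structure concrete, here is a minimal sketch of the standard ERGM likelihood and posterior; the notation (sufficient statistics s, normalizing constant z, graph space \mathcal{Y}) is assumed here rather than fixed by the abstract:

\[
p(y \mid \theta) = \frac{\exp\{\theta^{\top} s(y)\}}{z(\theta)},
\qquad
z(\theta) = \sum_{y' \in \mathcal{Y}} \exp\{\theta^{\top} s(y')\},
\qquad
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)} .
\]

The sum defining z(\theta) runs over all graphs on the node set, so it is intractable for all but very small networks, and the marginal likelihood p(y) inherits this intractability, which is why both normalizing constants are unavailable. The pseudolikelihood referenced above replaces p(y \mid \theta) by the tractable product of logistic conditionals \prod_{ij} p(y_{ij} \mid y_{-ij}, \theta) over dyads, with the adjustment intended to correct for this approximation.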