Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes

01/11/2021
by   Quan Zhou, et al.

We consider MCMC methods for learning equivalence classes of sparse Gaussian DAG models when the number of variables p may grow as e^{o(n)} with the sample size n. The main contribution of this work is a rapid mixing result for a random walk Metropolis-Hastings algorithm, which we prove using a canonical path method. This result shows that, under some common high-dimensional assumptions, the complexity of Bayesian learning of sparse equivalence classes grows only polynomially in n and p. Further, the path method yields a series of high-dimensional consistency results, including the strong selection consistency of an empirical Bayes model for structure learning and the consistency of a greedy local search on the restricted search space. Rapid mixing and slow mixing results for other structure-learning MCMC methods are also derived. Our path method and mixing time results yield crucial insights into the computational aspects of high-dimensional structure learning, which may be used to develop more efficient MCMC algorithms.
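To make the kind of sampler discussed above concrete, here is a minimal, illustrative sketch of a random walk Metropolis-Hastings chain over sparse DAGs. It is not the paper's algorithm (which operates on equivalence classes with a specific empirical Bayes posterior); the functions `log_score`, `dag_has_cycle`, and `rw_metropolis_hastings`, as well as the BIC-style score, are hypothetical choices made only for this example.

```python
import numpy as np

def dag_has_cycle(adj):
    """DFS-based check for a directed cycle in the adjacency matrix adj."""
    p = adj.shape[0]
    color = np.zeros(p, dtype=int)  # 0 = unvisited, 1 = in progress, 2 = done

    def visit(v):
        color[v] = 1
        for w in np.nonzero(adj[v])[0]:
            if color[w] == 1 or (color[w] == 0 and visit(w)):
                return True
        color[v] = 2
        return False

    return any(color[v] == 0 and visit(v) for v in range(p))

def log_score(adj, X, penalty=None):
    """Decomposable Gaussian log-score: for each node, the Gaussian log-likelihood of
    regressing it on its parents minus a BIC-style complexity penalty.  This is only a
    stand-in for the posterior score analyzed in the paper."""
    n, p = X.shape
    if penalty is None:
        penalty = 0.5 * np.log(n)
    total = 0.0
    for j in range(p):
        parents = np.nonzero(adj[:, j])[0]
        r = X[:, j]
        if len(parents):
            beta, *_ = np.linalg.lstsq(X[:, parents], X[:, j], rcond=None)
            r = X[:, j] - X[:, parents] @ beta
        sigma2 = max(np.mean(r ** 2), 1e-12)
        total += -0.5 * n * np.log(sigma2) - penalty * len(parents)
    return total

def rw_metropolis_hastings(X, n_iter=5000, seed=0):
    """Random-walk Metropolis-Hastings over DAGs: toggle a uniformly chosen edge i -> j,
    reject moves that leave the DAG space, and accept with probability
    min(1, exp(score difference))."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    adj = np.zeros((p, p), dtype=int)
    current = log_score(adj, X)
    for _ in range(n_iter):
        i, j = rng.choice(p, size=2, replace=False)
        proposal = adj.copy()
        proposal[i, j] ^= 1                     # add or delete the edge i -> j
        if proposal[i, j] and (proposal[j, i] or dag_has_cycle(proposal)):
            continue                            # proposal is not a DAG: reject
        new = log_score(proposal, X)
        if np.log(rng.uniform()) < new - current:
            adj, current = proposal, new
    return adj, current

# Example usage on synthetic data with a single true edge 0 -> 1.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
X[:, 1] += 2.0 * X[:, 0]
adj_hat, score = rw_metropolis_hastings(X)
```

The paper's rapid mixing analysis concerns how many such steps are needed before the chain's distribution is close to the posterior; the canonical path argument bounds this in terms of paths through the space of sparse structures.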
