Gibbs posterior convergence and the thermodynamic formalism
In this paper we consider a Bayesian framework for making inferences about dynamical systems from ergodic observations. The proposed Bayesian procedure is based on the Gibbs posterior, a decision-theoretic generalization of standard Bayesian inference. We place a prior over a model class consisting of a parametrized family of Gibbs measures on a mixing shift of finite type. This model class generalizes (hidden) Markov chain models by allowing for long-range dependencies, including Markov chains of arbitrarily large order. We characterize the asymptotic behavior of the Gibbs posterior distribution on the parameter space as the number of observations tends to infinity. In particular, we define a limiting variational problem over the space of joinings of the model system with the observed system, and we show that the Gibbs posterior distributions concentrate around the solution set of this variational problem. In the case of properly specified models, our convergence results may be used to establish posterior consistency. This work establishes tight connections between Gibbs posterior inference and the thermodynamic formalism, which may inspire new proof techniques in the study of Bayesian posterior consistency for dependent processes.
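For context, a minimal sketch of the Gibbs posterior construction is given below. The notation (empirical risk $r_n$, inverse temperature $\lambda$, prior $\pi_0$) is standard in the Gibbs posterior literature and is assumed here for illustration; it is not taken from this abstract, and the paper's specific risk functional may differ.

% Sketch of the generic Gibbs posterior (assumed notation, not verbatim
% from the abstract): given observations x_{1:n}, a parametrized empirical
% risk r_n(theta, x_{1:n}), a prior pi_0 on the parameter space, and an
% inverse temperature lambda > 0, the Gibbs posterior is
\[
  \pi_n(d\theta \mid x_{1:n})
    \;\propto\;
  \exp\bigl(-\lambda\, n\, r_n(\theta, x_{1:n})\bigr)\, \pi_0(d\theta).
\]
% This reduces to the usual Bayesian posterior when -n r_n(theta, x_{1:n})
% is the log-likelihood and lambda = 1; general risks allow inference
% without a fully specified likelihood.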