Posterior Probabilities: Nonmonotonicity, Asymptotic Rates, Log-Concavity, and Turán's Inequality
In the standard Bayesian framework, data are assumed to be generated by a distribution parametrized by θ in a parameter space Θ, over which a prior distribution π is given. A Bayesian statistician quantifies the belief that the true parameter is θ_0 in Θ by its posterior probability given the observed data. We investigate the behavior of the posterior belief in θ_0 when the data are generated under some parameter θ_1, which may or may not be the same as θ_0. Starting from stochastic orders, specifically likelihood-ratio dominance, that obtain for the resulting distributions of posteriors, we consider monotonicity properties of the posterior probabilities as a function of the sample size when data arrive sequentially. While the θ_0-posterior is monotonically increasing (i.e., it is a submartingale) when the data are generated under that same θ_0, it need not be monotonically decreasing in general, not even in terms of its overall expectation, when the data are generated under a different θ_1. In fact, it may keep going up and down many times, even in simple cases such as i.i.d. coin tosses. We obtain precise asymptotic rates when the data come from the wide class of exponential families of distributions; these rates imply in particular that the expectation of the θ_0-posterior under θ_1 ≠ θ_0 is eventually strictly decreasing. Finally, we show that in a number of interesting cases this expectation is a log-concave function of the sample size, and thus unimodal. In the Bernoulli case we obtain this by developing an inequality that is related to Turán's inequality for Legendre polynomials.
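As a minimal numerical sketch (not taken from the paper), the following Python snippet considers a two-point prior on {θ_0, θ_1} with i.i.d. Bernoulli(θ_1) data and computes the exact expectation, under θ_1, of the posterior probability of θ_0 as a function of the sample size n. The values of theta0, theta1, and the prior weight prior0 are illustrative assumptions; the sign of successive differences shows where this expectation rises or falls for the chosen values.

from math import comb

theta0, theta1 = 0.5, 0.6   # hypothesized vs. data-generating parameter (assumed values)
prior0 = 0.5                # prior mass on theta0 (assumed)

def expected_posterior(n):
    # E_{theta1}[ P(theta0 | X_1, ..., X_n) ] for n Bernoulli(theta1) tosses;
    # the posterior depends on the data only through k, the number of successes.
    total = 0.0
    for k in range(n + 1):
        lik0 = theta0 ** k * (1 - theta0) ** (n - k)
        lik1 = theta1 ** k * (1 - theta1) ** (n - k)
        post0 = prior0 * lik0 / (prior0 * lik0 + (1 - prior0) * lik1)
        total += comb(n, k) * lik1 * post0   # weight by the Binomial(n, theta1) pmf
    return total

vals = [expected_posterior(n) for n in range(31)]
for n in range(1, 31):
    print(n, round(vals[n], 6), "up" if vals[n] > vals[n - 1] else "down")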