Reference Bayesian analysis for hierarchical models
This paper proposes an alternative approach for constructing invariant Jeffreys prior distributions tailored to hierarchical or multilevel models. In particular, our proposal is based on a flexible decomposition of the Fisher information for hierarchical models that avoids the need to marginalize the likelihood over the model parameters. The Fisher information matrix for the hierarchical model is derived from the Hessian of the Kullback-Leibler (KL) divergence for the model in a neighborhood of the parameter value of interest. Properties of the KL divergence are used to prove the proposed decomposition. Our proposal exploits the hierarchy and leads to an alternative way of computing Jeffreys priors for the hyperparameters, together with an upper bound for the prior information. While the Jeffreys prior conveys the minimum information about the parameters, the proposed bound gives an upper limit on the information carried by any prior distribution; a prior with information above that limit may be considered too informative. From a practical point of view, the proposed prior may be evaluated computationally as part of an MCMC algorithm. This property can be essential for modeling setups with many levels, in which analytic marginalization is not feasible. We illustrate the usefulness of our proposal with examples in mixture models, in model-selection priors such as the lasso, and in the Student-t model.
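To fix ideas, the standard identities the abstract builds on are the Fisher information as the Hessian of the KL divergence at the parameter value of interest, and the Jeffreys prior it induces (these are textbook definitions, not quotations from the paper's own derivation):

$$
I(\theta) \;=\; \nabla^2_{\theta'}\,\mathrm{KL}\!\left(p(\cdot\mid\theta)\,\big\|\,p(\cdot\mid\theta')\right)\Big|_{\theta'=\theta},
\qquad
\pi_J(\theta)\;\propto\;\lvert I(\theta)\rvert^{1/2}.
$$

The sketch below illustrates how such a prior could be evaluated numerically inside an MCMC step, as the abstract suggests: it estimates the Jeffreys density via a finite-difference Hessian of a Monte Carlo KL estimate. This is a minimal one-dimensional illustration under an assumed N(theta, 1) model, not the paper's decomposition; the model, step size, and sample size are placeholders.

```python
# Minimal sketch: Jeffreys prior density pi(theta) ∝ |I(theta)|^{1/2},
# with I(theta) obtained as the second derivative of KL(theta || t) at
# t = theta. Assumed model: y ~ N(theta, 1), for which I(theta) = 1.
import numpy as np
from scipy.stats import norm

def jeffreys_density(theta, rng, h=1e-2, n_mc=50_000):
    """Unnormalized Jeffreys density via a finite-difference KL Hessian."""
    # Draw once from p(. | theta) and reuse across evaluations
    # (common random numbers keep the finite difference stable).
    y = rng.normal(theta, 1.0, size=n_mc)

    def kl_hat(theta_prime):
        # Monte Carlo estimate of KL(p_theta || p_theta').
        return np.mean(norm.logpdf(y, theta, 1.0)
                       - norm.logpdf(y, theta_prime, 1.0))

    # Central second difference in theta'; kl_hat(theta) is exactly 0.
    fisher = (kl_hat(theta + h) - 2.0 * kl_hat(theta)
              + kl_hat(theta - h)) / h**2
    return np.sqrt(max(fisher, 0.0))

rng = np.random.default_rng(0)
print(jeffreys_density(0.5, rng))  # ≈ 1.0, since I(theta) = 1 for N(theta, 1)
```

In a hierarchical setting one would, per the abstract, work level by level through the proposed decomposition rather than with the marginal likelihood directly; the numerical evaluation above is only meant to show why the KL-Hessian route is convenient when no closed form is available.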