Algorithmic Bayesian Group Gibbs Selection
Bayesian model selection, with precedents in George and McCulloch (1993) and Abramovich et al. (1998), supports credibility measures that convey model uncertainty, but computation can be costly when sparse priors are only approximated. We design an exact selection engine suitable for Gaussian noise, t-distributed noise, and logistic learning, benefiting from data structures derived from coordinate-descent lasso. Gibbs sampler chains are stored in a compressed binary format compatible with Equi-Energy tempering (Kou et al., 2006). We achieve a grouped-effects selection model, similar to the group lasso setting, to determine co-entry of coefficients into the model. We derive a functional integrand for group inclusion, and introduce a new MCMC switching step to avoid numerical integration. Theorems show that this step converges exponentially fast to its target distribution. We demonstrate a role for group selection in informing the genetic decomposition of a diallel experiment, and identify potential quantitative trait loci in p > 40K Heterogeneous Stock haplotype/phenotype studies.
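For intuition only, below is a minimal Python sketch of a grouped spike-and-slab Gibbs update of the general kind the abstract describes: each group of coefficients shares one binary inclusion indicator, and the group's coefficients are marginalized analytically when that indicator is sampled. This is not the paper's implementation; the Gaussian slab prior, the names tau2, pi_incl, and n_iter, and the conjugate sigma2 update are illustrative assumptions. In this conjugate Gaussian case the group-inclusion integrand has a closed form; the paper's switching step targets settings where such integrals would otherwise require numerical integration.

import numpy as np

def group_gibbs(y, X, groups, tau2=1.0, pi_incl=0.5, n_iter=1000, seed=0):
    """Grouped spike-and-slab Gibbs sampler for y = X beta + N(0, sigma2) noise.

    groups : list of column-index arrays, one per coefficient group.
    Returns posterior draws of the group-inclusion indicators.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    sigma2 = 1.0
    gamma = np.zeros(len(groups), dtype=bool)
    draws = np.zeros((n_iter, len(groups)), dtype=bool)

    for it in range(n_iter):
        for g, idx in enumerate(groups):
            Xg = X[:, idx]
            # Partial residual: the fit with group g's contribution removed.
            r = y - X @ beta + Xg @ beta[idx]
            k = len(idx)
            A = Xg.T @ Xg + (sigma2 / tau2) * np.eye(k)
            m = Xg.T @ r
            Am = np.linalg.solve(A, m)
            # Log Bayes factor for inclusion, with beta_g integrated out:
            # -1/2 log det(I + (tau2/sigma2) Xg'Xg) + m' A^{-1} m / (2 sigma2).
            _, logdet = np.linalg.slogdet(np.eye(k) + (tau2 / sigma2) * (Xg.T @ Xg))
            log_bf = -0.5 * logdet + 0.5 * (m @ Am) / sigma2
            log_odds = np.log(pi_incl / (1.0 - pi_incl)) + log_bf
            prob_in = 1.0 / (1.0 + np.exp(-np.clip(log_odds, -50.0, 50.0)))
            gamma[g] = rng.random() < prob_in
            if gamma[g]:
                # beta_g | inclusion ~ N(A^{-1} m, sigma2 A^{-1}).
                L = np.linalg.cholesky(np.linalg.inv(A))
                beta[idx] = Am + np.sqrt(sigma2) * (L @ rng.standard_normal(k))
            else:
                beta[idx] = 0.0
        # sigma2 | rest: conjugate inverse-gamma update (IG(1, 1) prior assumed).
        resid = y - X @ beta
        sigma2 = (1.0 + resid @ resid / 2.0) / rng.gamma(1.0 + n / 2.0)
        draws[it] = gamma
    return draws

Averaging the returned draws column-wise gives Monte Carlo estimates of each group's posterior inclusion probability, the credibility measure referred to above; a practical engine would add the paper's compressed chain storage, tempering, and non-Gaussian likelihoods on top of this skeleton.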