Distributed Bayesian Varying Coefficient Modeling Using a Gaussian Process Prior

06/01/2020
by Rajarshi Guhaniyogi, et al.

Varying coefficient models (VCMs) are widely used for estimating nonlinear regression functions in functional data models. Their Bayesian variants, which place Gaussian process (GP) priors on the functional coefficients, have received limited attention in massive data applications, primarily because posterior computation with Markov chain Monte Carlo (MCMC) algorithms is prohibitively slow. We address this problem using a divide-and-conquer Bayesian approach that operates in three steps. The first step creates a large number of data subsets with much smaller sample sizes by sampling without replacement from the full data. The second step formulates the VCM as a linear mixed-effects model and develops a data augmentation (DA)-type algorithm for obtaining MCMC draws of the parameters and predictions on all the subsets in parallel. The DA-type algorithm appropriately modifies the likelihood so that every subset posterior distribution is an accurate approximation of the corresponding true posterior distribution. The third step develops a combination algorithm for aggregating MCMC-based estimates of the subset posterior distributions into a single posterior distribution, called the Aggregated Monte Carlo (AMC) posterior. Theoretically, we derive minimax optimal posterior convergence rates for the AMC posterior distributions of both the varying coefficients and the mean regression function. We quantify the orders of the subset sample sizes and the number of subsets according to the smoothness properties of the multivariate GP. The empirical results show that combination schemes satisfying our theoretical assumptions, including the one in the AMC algorithm, have better nominal coverage, shorter credible intervals, smaller mean square errors, and higher effective sample sizes than their main competitors across diverse simulations and in a real data analysis.
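As a rough illustration of the three-step divide-and-conquer pipeline described in the abstract, the Python sketch below (i) partitions the data into subsets by sampling without replacement, (ii) draws from subset posteriors whose likelihood contribution is raised to the power of the number of subsets so that each subset posterior has roughly the spread of the full-data posterior, and (iii) combines the draws by averaging posterior quantiles, i.e., the one-dimensional Wasserstein barycenter. It uses a conjugate Gaussian mean model as a stand-in for the paper's DA-type sampler for the GP varying coefficient model; the function names `partition_data`, `subset_posterior_draws`, and `combine_by_quantile_averaging` are hypothetical, and the combination rule shown is one simple scheme rather than the paper's AMC algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def partition_data(y, k):
    """Step 1: split the data into k subsets by sampling without replacement."""
    idx = rng.permutation(len(y))
    return [y[block] for block in np.array_split(idx, k)]

def subset_posterior_draws(y_sub, k, n_draws=2000, prior_var=100.0, noise_var=1.0):
    """Step 2 (stand-in): sample a subset posterior for a Gaussian mean,
    raising the likelihood to the power k (a stochastic-approximation style
    modification) so the subset posterior approximates the full-data posterior.
    This replaces, for illustration only, the paper's DA-type sampler for the
    GP varying coefficient model."""
    n = len(y_sub)
    # Conjugate normal update with the likelihood contribution multiplied by k.
    post_prec = 1.0 / prior_var + k * n / noise_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (k * np.sum(y_sub) / noise_var)
    return rng.normal(post_mean, np.sqrt(post_var), size=n_draws)

def combine_by_quantile_averaging(subset_draws):
    """Step 3 (one simple combination scheme): average the sorted draws across
    subsets, which is the 1-D Wasserstein barycenter of the subset posteriors."""
    sorted_draws = np.sort(np.asarray(subset_draws), axis=1)
    return sorted_draws.mean(axis=0)

# Toy data: 10,000 observations from N(2, 1); the combined posterior mean
# should land near 2 with a credible interval similar to the full-data one.
y = rng.normal(2.0, 1.0, size=10_000)
k = 20
subsets = partition_data(y, k)
draws = [subset_posterior_draws(s, k) for s in subsets]
combined = combine_by_quantile_averaging(draws)
print("combined posterior mean:", combined.mean())
print("95% credible interval:", np.quantile(combined, [0.025, 0.975]))
```

In this toy setting the quantile-averaged posterior closely matches the full-data conjugate posterior; the paper's contribution is to make the subset-and-combine recipe work for GP-based varying coefficient models, with theoretical guidance on how many subsets of what size to use.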
