Sparse Additive Gaussian Process Regression
In this paper we introduce a novel model for Gaussian process (GP) regression in the fully Bayesian setting. Motivated by ideas from sparsification, localization, and Bayesian additive modeling, our model is built around a recursive partitioning (RP) scheme. Within each RP partition, a sparse GP regression model is fitted. A Bayesian additive framework combines the partitions, allowing the model to capture both global trends and local refinements, while the sparse GP construction within each component keeps computation efficient. The model addresses both the inefficiency of fitting a full Gaussian process regression model and the degraded predictive performance of a single sparse Gaussian process. Our approach mitigates the issue of pseudo-input selection and avoids the complex inter-block correlations required by existing approaches. Furthermore, the proposed model can also capture non-stationarity. The crucial trade-off becomes choosing between many simpler local model components and fewer, more complex global components, a choice the practitioner can tune easily and sensibly. Implementation is via a straightforward Metropolis-Hastings Markov chain Monte Carlo algorithm. We compare our model against popular alternatives on simulated and real datasets and find its performance competitive, while the fully Bayesian procedure enables the quantification of model uncertainties.
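The additive global-plus-local structure described above can be illustrated with a minimal, hypothetical sketch. Here a fixed grid of partitions stands in for the paper's recursive partitioning prior, a simple subset-of-data approximation stands in for the sparse GP construction, and the fully Bayesian MCMC inference is omitted entirely; all function names, hyperparameters, and partition choices below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def rbf(x1, x2, ls, var):
    # squared-exponential kernel between two 1-D input arrays
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def sparse_gp_mean(x, y, x_star, ls, var, noise, n_inducing):
    # posterior mean under a crude subset-of-data sparse approximation
    # (a stand-in for the paper's sparse GP construction)
    idx = np.linspace(0, len(x) - 1, min(n_inducing, len(x))).astype(int)
    z, yz = x[idx], y[idx]
    K = rbf(z, z, ls, var) + noise * np.eye(len(z))
    return rbf(x_star, z, ls, var) @ np.linalg.solve(K, yz)

# toy data: a smooth global trend plus a fast local wiggle
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.3 * np.sin(8 * x) + 0.05 * rng.standard_normal(200)
x_test = np.linspace(0, 10, 400)

# component 1: one long-lengthscale sparse GP for the global trend
f_glob_tr = sparse_gp_mean(x, y, x, ls=2.0, var=1.0, noise=0.05, n_inducing=15)
f_glob = sparse_gp_mean(x, y, x_test, ls=2.0, var=1.0, noise=0.05, n_inducing=15)

# component 2: short-lengthscale sparse GPs on the residuals, one per partition
resid = y - f_glob_tr
edges = np.linspace(0, 10, 5)  # four fixed blocks stand in for the RP prior
f_loc = np.zeros_like(x_test)
for lo, hi in zip(edges[:-1], edges[1:]):
    tr = (x >= lo) & (x <= hi)
    te = (x_test >= lo) & (x_test <= hi)
    if tr.sum() > 2 and te.any():
        f_loc[te] = sparse_gp_mean(x[tr], resid[tr], x_test[te],
                                   ls=0.3, var=0.5, noise=0.05, n_inducing=10)

pred = f_glob + f_loc  # additive combination of global and local components
```

Because the components are added independently, no inter-block correlation structure is needed: the local components only refine whatever the global component leaves unexplained, mirroring the trade-off between many simple local components and fewer complex global ones.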