Scalable Cross Validation Losses for Gaussian Process Models

05/24/2021
by Martin Jankowiak, et al.

We introduce a simple and scalable method for training Gaussian process (GP) models that exploits cross-validation and nearest neighbor truncation. To accommodate binary and multi-class classification we leverage Pólya-Gamma auxiliary variables and variational inference. In an extensive empirical comparison with a number of alternative methods for scalable GP regression and classification, we find that our method offers fast training and excellent predictive performance. We argue that the good predictive performance can be traced to the non-parametric nature of the resulting predictive distributions as well as to the cross-validation loss, which provides robustness against model mis-specification.
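To make the core idea concrete, below is a minimal NumPy sketch of a nearest-neighbor-truncated leave-one-out cross-validation loss for GP regression: each held-out point is scored under a GP predictive distribution conditioned only on its k nearest neighbors. The kernel choice, the plain leave-one-out form, and all names (`rbf_kernel`, `nn_loo_cv_loss`, `lengthscale`, `outputscale`, `noise`) are illustrative assumptions, not the paper's exact objective, which also covers classification via Pólya-Gamma augmentation.

```python
import numpy as np
from scipy.spatial import cKDTree

def rbf_kernel(X1, X2, lengthscale, outputscale):
    # Squared-exponential kernel: k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return outputscale * np.exp(-0.5 * d2 / lengthscale**2)

def nn_loo_cv_loss(X, y, lengthscale=1.0, outputscale=1.0, noise=0.1, k=16):
    """Average negative leave-one-out predictive log-likelihood, with each
    held-out point conditioned only on its k nearest neighbors."""
    tree = cKDTree(X)
    # Query k+1 neighbors; the first is the point itself, which we drop.
    _, idx = tree.query(X, k=k + 1)
    loss = 0.0
    for i in range(len(X)):
        nbrs = idx[i, 1:]
        Knn = rbf_kernel(X[nbrs], X[nbrs], lengthscale, outputscale)
        Knn[np.diag_indices_from(Knn)] += noise  # observation noise on the diagonal
        kin = rbf_kernel(X[i:i + 1], X[nbrs], lengthscale, outputscale)[0]
        # GP posterior predictive for point i given its neighbors.
        mu = kin @ np.linalg.solve(Knn, y[nbrs])
        var = outputscale - kin @ np.linalg.solve(Knn, kin) + noise
        loss -= -0.5 * np.log(2 * np.pi * var) - 0.5 * (y[i] - mu) ** 2 / var
    return loss / len(X)
```

In practice the hyperparameters would be fit by minimizing such a loss with a gradient-based optimizer over minibatches of data points; the truncation keeps each term's cost at O(k^3) rather than cubic in the full dataset size, which is what makes the approach scalable.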
