Quadruply Stochastic Gaussian Processes

06/04/2020
by Trefor W. Evans, et al.

We introduce a stochastic variational inference procedure for training scalable Gaussian process (GP) models whose per-iteration complexity is independent of both the number of training points, n, and the number of basis functions used in the kernel approximation, m. Our central contributions include an unbiased stochastic estimator of the evidence lower bound (ELBO) for a Gaussian likelihood, as well as a stochastic estimator that lower bounds the ELBO for several other likelihoods, such as Laplace and logistic. The independence of the per-update complexity from n and m enables inference on huge datasets using large-capacity GP models. We demonstrate accurate inference on large classification and regression datasets using GPs and relevance vector machines with up to m = 10^7 basis functions.
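The key idea of subsampling over both n and m can be illustrated on a toy problem. The sketch below (a hypothetical illustration, not the paper's actual estimator) shows how to build an unbiased estimate of a quantity of the form sum_i (sum_j A[i, j])^2, which resembles the squared-error data-fit term of a basis-function model: a minibatch handles the sum over the n data points, and because the square of a subsampled inner sum would be biased, two independent subsamples over the m basis functions are multiplied so that the product is unbiased in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 40                         # toy sizes; "data points" x "basis functions"
A = rng.normal(size=(n, m))           # stand-in for per-point basis contributions

def exact_value(A):
    # Full O(n*m) computation of sum_i (sum_j A[i, j])^2.
    return float((A.sum(axis=1) ** 2).sum())

def stochastic_estimate(A, rng, batch=5, cols=8):
    # Touches only `batch` rows and 2*`cols` columns per call.
    n, m = A.shape
    rows = rng.integers(0, n, size=batch)   # minibatch over data points
    c1 = rng.integers(0, m, size=cols)      # first feature subsample
    c2 = rng.integers(0, m, size=cols)      # independent second subsample
    # Each rescaled partial sum is an unbiased estimate of the full row sum.
    s1 = (m / cols) * A[np.ix_(rows, c1)].sum(axis=1)
    s2 = (m / cols) * A[np.ix_(rows, c2)].sum(axis=1)
    # Independence gives E[s1 * s2] = (row sum)^2, so the product is unbiased
    # for the squared row sum even though s1**2 alone would not be.
    return float((n / batch) * (s1 * s2).sum())

estimates = [stochastic_estimate(A, rng) for _ in range(20000)]
print(exact_value(A), np.mean(estimates))
```

Averaged over many draws, the stochastic estimates converge to the exact value, while each individual draw costs O(batch * cols) rather than O(n * m). The paper's estimators apply this style of decoupled subsampling to the ELBO terms themselves.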
