Scalable computation of predictive probabilities in probit models with Gaussian process priors

09/03/2020
by Jian Cao, et al.

Predictive models for binary data are fundamental in various fields, and the growing complexity of modern applications has motivated several flexible specifications for modeling the relationship between the observed predictors and the binary responses. A widely implemented solution expresses the probability parameter via a probit mapping of a Gaussian process indexed by predictors. However, unlike in continuous settings, there is a lack of closed-form results for predictive distributions in binary models with Gaussian process priors. Markov chain Monte Carlo methods and approximation strategies provide common solutions to this problem, but state-of-the-art algorithms are either computationally intractable or inaccurate in moderate-to-high dimensions. In this article, we aim to fill this gap by deriving closed-form expressions for the predictive probabilities in probit Gaussian processes that rely either on cumulative distribution functions of multivariate Gaussians or on functionals of multivariate truncated normals. To evaluate such quantities we develop novel scalable solutions based on tile-low-rank Monte Carlo methods for computing multivariate Gaussian probabilities and on variational approximations of multivariate truncated normals. Closed-form expressions for marginal likelihoods and posterior distributions of the Gaussian process are also discussed. As illustrated in empirical studies, the proposed methods scale to dimensions where state-of-the-art solutions are impractical.
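To make the abstract's central identity concrete, the sketch below illustrates one standard route to such closed-form expressions: under the latent-variable representation of the probit Gaussian process, y_i = 1{f(x_i) + eps_i > 0} with f ~ GP(0, K) and eps_i ~ N(0, 1), the joint probability of an observed binary configuration is the multivariate Gaussian orthant probability Phi_n(0; 0, D(K + I)D) with D = diag(2y - 1), and a predictive probability follows as a ratio of two such cumulative distribution functions. This is only a minimal brute-force illustration, not the paper's method: the RBF kernel, hyperparameter values, and the use of scipy's quasi-Monte Carlo cdf routine are assumptions made here for demonstration, whereas the paper develops tile-low-rank Monte Carlo and variational schemes precisely because direct evaluation like this does not scale.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.spatial.distance import cdist


def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two input sets (illustrative choice)."""
    d2 = cdist(X1, X2, "sqeuclidean")
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)


def joint_probit_gp_probability(X, y, lengthscale=1.0, variance=1.0):
    """Pr(y) for y_i | f ~ Bernoulli(Phi(f(x_i))), f ~ GP(0, K).

    Marginalising f gives the orthant probability Phi_n(0; 0, D (K + I) D),
    with D = diag(2y - 1)."""
    n = X.shape[0]
    s = 2.0 * np.asarray(y, dtype=float) - 1.0       # labels {0,1} -> signs {-1,+1}
    K = rbf_kernel(X, X, lengthscale, variance) + np.eye(n)
    S = (s[:, None] * s[None, :]) * K                # D (K + I) D
    # Multivariate Gaussian cdf at the origin, evaluated by scipy's
    # quasi-Monte Carlo integration (feasible only for modest n).
    return multivariate_normal(mean=np.zeros(n), cov=S, allow_singular=True).cdf(np.zeros(n))


def predictive_probability(X_train, y_train, x_new, **kw):
    """Pr(y_new = 1 | y_train) as a ratio of two multivariate Gaussian cdfs."""
    X_joint = np.vstack([X_train, x_new[None, :]])
    num = joint_probit_gp_probability(X_joint, np.append(y_train, 1), **kw)
    den = joint_probit_gp_probability(X_train, y_train, **kw)
    return num / den


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-2.0, 2.0, size=(15, 1))
    y = (X[:, 0] > 0).astype(float)                  # toy labels for a 1-d example
    print(predictive_probability(X, y, np.array([1.5])))
```

The ratio computed above involves Gaussian cdfs whose dimension grows with the number of training points, and generic quadrature or quasi-Monte Carlo evaluation quickly becomes the bottleneck; this is the computational problem that the tile-low-rank Monte Carlo methods and truncated-normal variational approximations described in the abstract are designed to address at scale.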
