A Probabilistic Numerical Extension of the Conjugate Gradient Method

08/07/2020
by   Tim W. Reid, et al.

We present a Conjugate Gradient (CG) implementation of the probabilistic numerical solver BayesCG, whose error estimates are a fully integrated design feature, easy to compute, and competitive with the best existing estimators. More specifically, we extend BayesCG to singular prior covariances, derive recursions for the posterior covariances, express the posteriors as projections, and establish that BayesCG retains its minimization properties over Krylov spaces regardless of the singularity of the priors. We introduce a possibly singular Krylov prior covariance, under which the BayesCG posterior means coincide with the CG iterates and the posteriors can be computed efficiently. Because of its factored form, the Krylov prior is amenable to low-rank approximation, which produces an efficient implementation of BayesCG as a CG method. We also introduce a probabilistic error estimator, the 'S-statistic'. Although designed for sampling from BayesCG posteriors, its mean and variance under approximate Krylov priors can also be computed with CG. Approximation of the S-statistic by a '95 percent credible interval' avoids the cost of sampling altogether. Numerical experiments illustrate that the resulting error estimates are competitive with the best existing methods and are easy to compute.
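As a rough illustration of the abstract's claim that the BayesCG posterior means coincide with the CG iterates under the Krylov prior, here is a minimal sketch of standard CG in Python/NumPy. The delayed error estimate it accumulates from the quantities gamma_k * ||r_k||^2 is the classical Strakoš-Tichý-style estimate of the squared A-norm error, shown only as an example of an error estimate computable as a CG by-product; it is not the paper's S-statistic, and the function name, `delay` parameter, and tolerance are illustrative assumptions.

```python
# A minimal sketch (not the paper's implementation): standard CG for A x = b
# with A symmetric positive definite. Under the paper's Krylov prior, the
# BayesCG posterior means coincide with these CG iterates x_k.
import numpy as np

def cg_with_error_estimate(A, b, x0=None, tol=1e-10, max_iter=None, delay=4):
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x                  # residual r_0
    p = r.copy()                   # first search direction
    rs = r @ r                     # ||r_k||^2
    gammas, iterates = [], [x.copy()]
    max_iter = max_iter or n
    for _ in range(max_iter):
        Ap = A @ p
        gamma = rs / (p @ Ap)      # step length gamma_k
        x += gamma * p
        r -= gamma * Ap
        rs_new = r @ r
        gammas.append(gamma * rs)  # gamma_k * ||r_k||^2, one "slice" of the
                                   # squared A-norm error
        iterates.append(x.copy())
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p  # new A-conjugate search direction
        rs = rs_new
    # Delayed (Strakos-Tichy-style) estimate of the squared A-norm error:
    # ||x* - x_k||_A^2 ≈ sum_{i=k}^{k+delay-1} gamma_i * ||r_i||^2.
    est = [sum(gammas[k:k + delay]) for k in range(len(gammas) - delay)]
    return x, iterates, est
```

For example, calling `x, iterates, est = cg_with_error_estimate(A, b)` on a small SPD system lets one compare `est[k]` against the true squared A-norm error at step k; the estimate improves as `delay` grows, at the cost of lagging behind the current iterate.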
