The conjugate gradient algorithm on a general class of spiked covariance matrices

06/25/2021
by Xiucai Ding, et al.

We consider the conjugate gradient algorithm applied to a general class of spiked sample covariance matrices. The main result of the paper is that the norms of the error and residual vectors at any finite step concentrate on deterministic values determined by orthogonal polynomials with respect to a deformed Marchenko–Pastur law. The first-order limits and fluctuations are shown to be universal. Additionally, for the case where the bulk eigenvalues lie in a single interval, we show a stronger universality result: the asymptotic rate of convergence of the conjugate gradient algorithm depends only on the support of the bulk, provided the spikes are well-separated from the bulk. In particular, this shows that the classical condition number bound for the conjugate gradient algorithm is pessimistic for spiked matrices.
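As a rough illustration of the setting (a minimal sketch, not the paper's construction), the code below builds a spiked sample covariance matrix, runs plain conjugate gradient on it, and records the residual norm after each step. The dimensions, spike strengths, and right-hand side are illustrative choices.

```python
# Minimal sketch: conjugate gradient on a spiked sample covariance matrix.
# All parameter values here are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
p, n = 500, 1000                      # dimension and sample size, ratio p/n = 0.5
spikes = np.array([10.0, 6.0])        # spiked population eigenvalues, well-separated from the bulk

# Population covariance: identity plus a few spiked directions (diagonal model).
sigma = np.ones(p)
sigma[: len(spikes)] = spikes
X = rng.standard_normal((p, n)) * np.sqrt(sigma)[:, None]
W = X @ X.T / n                       # spiked sample covariance matrix

b = rng.standard_normal(p)
b /= np.linalg.norm(b)

def conjugate_gradient(A, b, steps):
    """Plain CG on A x = b; returns the residual norm after each step."""
    x = np.zeros_like(b)
    r = b - A @ x
    d = r.copy()
    norms = []
    for _ in range(steps):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
        norms.append(np.linalg.norm(r))
    return norms

for k, res in enumerate(conjugate_gradient(W, b, steps=10), start=1):
    print(f"step {k:2d}: ||residual|| = {res:.3e}")
```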
