A Riemann–Hilbert approach to the perturbation theory for orthogonal polynomials: Applications to numerical linear algebra and random matrix theory
We establish a new perturbation theory for orthogonal polynomials using a Riemann–Hilbert approach and consider applications in numerical linear algebra and random matrix theory. We show that the orthogonal polynomials with respect to two measures can be effectively compared through the difference of their Stieltjes transforms on a suitably chosen contour. Moreover, when the two measures are close and satisfy certain regularity conditions, we use the theta functions of a hyperelliptic Riemann surface to derive explicit and accurate expansion formulae for the perturbed orthogonal polynomials; the leading error terms are fully characterized by the difference of the Stieltjes transforms on the contour. These results are applied to analyze several numerical algorithms from linear algebra, including the Lanczos tridiagonalization procedure, the Cholesky factorization and the conjugate gradient algorithm (CGA). As a case study, we investigate these algorithms applied to a general spiked sample covariance matrix model by comparing the eigenvector empirical spectral distribution with its limit, which yields precise estimates for the algorithms as the number of iterations diverges. For this concrete random matrix model, beyond the first-order expansion, we derive a mesoscopic central limit theorem for the associated orthogonal polynomials and other quantities relevant to the numerical algorithms.
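For orientation, here is a minimal sketch of the central objects the abstract refers to; the notation and sign conventions are assumptions and may differ from the paper. For a probability measure \(\mu\) on \(\mathbb{R}\), the Stieltjes transform and the three-term recurrence satisfied by its monic orthogonal polynomials \((\pi_n)_{n\ge 0}\) read
\[
  m_\mu(z) \;=\; \int_{\mathbb{R}} \frac{\mathrm{d}\mu(x)}{x - z}, \qquad z \in \mathbb{C}\setminus\operatorname{supp}\mu,
\]
\[
  \pi_{n+1}(x) \;=\; (x - a_n)\,\pi_n(x) \;-\; b_n^2\,\pi_{n-1}(x), \qquad \pi_{-1}\equiv 0,\ \ \pi_0\equiv 1,
\]
and, for two measures \(\mu\) and \(\nu\), the comparison described in the abstract is controlled by the size of \(m_\mu(z) - m_\nu(z)\) on a suitably chosen contour surrounding the supports.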
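To illustrate the numerical-linear-algebra side, the following is a small sketch (not the paper's procedure; the dimensions, spike strength, and variable names are illustrative assumptions) of Lanczos tridiagonalization applied to a spiked sample covariance matrix with NumPy. The coefficients it returns are the Jacobi (three-term recurrence) coefficients of the orthogonal polynomials for the weighted spectral measure determined by the starting vector, which is the link between the algorithm and the perturbation theory described above.

```python
import numpy as np

def lanczos(A, q0, k):
    """Plain Lanczos tridiagonalization: returns Jacobi coefficients (alpha, beta).

    alpha[j], beta[j] are the three-term recurrence coefficients of the
    orthogonal polynomials for the measure sum_i |<q0, v_i>|^2 * delta_{lambda_i},
    where (lambda_i, v_i) are the eigenpairs of the symmetric matrix A.
    """
    n = A.shape[0]
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    q_prev = np.zeros(n)
    q = q0 / np.linalg.norm(q0)
    for j in range(k):
        w = A @ q
        alpha[j] = q @ w
        w = w - alpha[j] * q - (beta[j - 1] * q_prev if j > 0 else 0.0)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q_prev, q = q, w / beta[j]
    return alpha, beta

# Spiked sample covariance model (sizes and spike strength chosen arbitrarily):
rng = np.random.default_rng(0)
p, n, spike = 500, 1000, 5.0
sigma_sqrt = np.ones(p)
sigma_sqrt[0] = np.sqrt(spike)                 # Sigma = diag(spike, 1, ..., 1)
Y = sigma_sqrt[:, None] * rng.standard_normal((p, n))
S = Y @ Y.T / n                                # sample covariance matrix

alpha, beta = lanczos(S, rng.standard_normal(p), k=30)
print(alpha[:5], beta[:5])
```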