Convergence of Eigenvector Continuation

04/16/2020
by Avik Sarkar, et al.

Eigenvector continuation is a computational method that finds the extremal eigenvalues and eigenvectors of a Hamiltonian matrix with one or more control parameters. It does this by projection onto a subspace of eigenvectors corresponding to selected training values of the control parameters. The method has proven to be very efficient and accurate for interpolating and extrapolating eigenvectors. However, almost nothing is known about how the method converges, and its rapid convergence properties have remained mysterious. In this letter we present the first study of the convergence of eigenvector continuation. In order to perform the mathematical analysis, we introduce a new variant of eigenvector continuation that we call vector continuation. We first prove that eigenvector continuation and vector continuation have identical convergence properties and then analyze the convergence of vector continuation. Our analysis shows that, in general, eigenvector continuation converges more rapidly than perturbation theory. The faster convergence is achieved by eliminating a phenomenon that we call differential folding, the interference between non-orthogonal vectors appearing at different orders in perturbation theory. From our analysis we can predict how eigenvector continuation converges both inside and outside the radius of convergence of perturbation theory. While eigenvector continuation is a non-perturbative method, we show that its rate of convergence can be deduced from power series expansions of the eigenvectors. Our results also yield new insights into the nature of divergences in perturbation theory.
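To make the projection step described in the abstract concrete, the following Python sketch applies eigenvector continuation to a one-parameter Hamiltonian H(c) = H0 + c*H1: exact ground states are computed at a few training couplings, the target Hamiltonian is projected onto their span, and a small generalized eigenvalue problem is solved because the training vectors are not orthogonal. The specific matrices, training couplings, and the target value c_target are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
dim = 200

# Hypothetical Hermitian matrices standing in for H0 and the perturbation H1.
A = rng.standard_normal((dim, dim))
H0 = (A + A.T) / 2
B = rng.standard_normal((dim, dim))
H1 = (B + B.T) / 2

def ground_state(c):
    # Lowest eigenvalue and eigenvector of H(c) = H0 + c*H1 by exact diagonalization.
    vals, vecs = np.linalg.eigh(H0 + c * H1)
    return vals[0], vecs[:, 0]

# Training step: exact ground states at selected values of the control parameter.
training_couplings = [0.1, 0.2, 0.3]
V = np.column_stack([ground_state(c)[1] for c in training_couplings])

# Target step: project H(c_target) onto the subspace spanned by the training
# eigenvectors and solve the resulting small generalized eigenvalue problem,
# using the overlap (norm) matrix since the training vectors are non-orthogonal.
c_target = 0.8
H_proj = V.T @ (H0 + c_target * H1) @ V
N_proj = V.T @ V
ec_vals, ec_vecs = eigh(H_proj, N_proj)

exact_val, _ = ground_state(c_target)
print(f"EC estimate  : {ec_vals[0]:.6f}")
print(f"Exact result : {exact_val:.6f}")

Even with only a handful of training vectors, the projected estimate typically tracks the exact extremal eigenvalue closely, which is the efficiency the abstract refers to; the paper's analysis concerns how quickly this estimate converges as training vectors are added.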
