Conjugate Gradients and Accelerated Methods Unified: The Approximate Duality Gap View

06/29/2019
by Jelena Diakonikolas, et al.

This note provides a novel, simple analysis of the method of conjugate gradients for the minimization of convex quadratic functions. In contrast with standard arguments, our proof is entirely self-contained and does not rely on the existence of Chebyshev polynomials. Another advantage of our development is that it clarifies the relation between the method of conjugate gradients and general accelerated methods for smooth minimization by unifying their analyses within the framework of the Approximate Duality Gap Technique that was introduced by the authors.
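For context, the method of conjugate gradients analyzed in the note minimizes a convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, or equivalently solves Ax = b. The following is a minimal NumPy sketch of the standard algorithm (not the paper's duality-gap analysis itself); the function name and tolerance parameter are illustrative choices, not from the source:

```python
import numpy as np

def conjugate_gradients(A, b, x0=None, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A, i.e. solve A x = b, by conjugate gradients."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x          # residual = negative gradient of f at x
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # residual small enough: done
            break
        p = r + (rs_new / rs) * p  # next direction, A-conjugate to the previous ones
        rs = rs_new
    return x

# Example: a small symmetric positive definite system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradients(A, b)
```

In exact arithmetic the iteration terminates in at most n steps, since each new direction is A-conjugate to all previous ones; the classical convergence-rate analysis that the note bypasses rests on Chebyshev polynomial bounds.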
