Restarted Nonconvex Accelerated Gradient Descent: No More Polylogarithmic Factor in the O(ε^-7/4) Complexity

01/27/2022
by Huan Li, et al.

This paper studies accelerated gradient descent for general nonconvex problems under the gradient Lipschitz and Hessian Lipschitz assumptions. We establish that a simple restarted accelerated gradient descent (AGD) finds an ϵ-approximate first-order stationary point in O(ϵ^-7/4) gradient computations, with simple proofs. Our complexity hides no polylogarithmic factors, and thus improves over the state-of-the-art bound by a factor of O(log(1/ϵ)). The algorithm consists only of Nesterov's classical AGD and a restart mechanism, and it requires neither negative curvature exploitation nor the optimization of regularized surrogate functions. Technically, our simple proof does not invoke the analysis of the strongly convex AGD, which is crucial for removing the O(log(1/ϵ)) factor.
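The restart pattern described above is simple enough to sketch in code. Below is a minimal Python illustration, not the paper's exact method: the inner loop runs Nesterov's classical AGD, and an epoch is restarted once the iterates have moved too far from the epoch's starting point. The step size eta, momentum parameter theta, restart threshold B, and epoch length K are illustrative placeholders; the paper derives these quantities, and the precise restart condition, from the gradient and Hessian Lipschitz constants.

```python
import numpy as np

def restarted_agd(grad, x0, eta=0.01, theta=0.1, B=1.0, K=1000,
                  eps=1e-4, max_epochs=100):
    """Illustrative sketch of restarted AGD for finding an
    eps-approximate first-order stationary point (||grad f|| <= eps).

    eta, theta, B, and K are placeholders; the paper sets them
    from the gradient and Hessian Lipschitz constants.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_epochs):
        # Start a fresh AGD epoch from the current point,
        # with the momentum reset.
        x_prev = x.copy()
        y = x.copy()
        movement = 0.0
        for _ in range(K):
            g = grad(y)
            if np.linalg.norm(g) <= eps:
                return y  # eps-approximate stationary point found
            x_new = y - eta * g                           # gradient step
            y = x_new + (1.0 - theta) * (x_new - x_prev)  # momentum step
            movement += np.linalg.norm(x_new - x_prev) ** 2
            x_prev = x_new
            if movement > B ** 2:
                break  # iterates drifted too far: trigger a restart
        x = x_prev
    return x

if __name__ == "__main__":
    # Toy nonconvex example: f(x) = x^4/4 - x^2/2, grad f(x) = x^3 - x.
    grad_f = lambda x: x**3 - x
    print(restarted_agd(grad_f, np.array([2.5]), eta=0.05))
```

Note how the restart resets the momentum: each epoch begins as plain gradient descent from the current iterate, which is what lets the analysis avoid the strongly convex AGD machinery and the associated O(log(1/ϵ)) factor.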
