An Optimal Stochastic Algorithm for Decentralized Nonconvex Finite-sum Optimization

10/25/2022
by Luo Luo, et al.

This paper studies the synchronized decentralized nonconvex optimization problem of the form min_{x∈ℝ^d} f(x) ≜ (1/m)∑_{i=1}^m f_i(x), where f_i(x) ≜ (1/n)∑_{j=1}^n f_{i,j}(x) is the local function on the i-th agent of a connected network. We propose a novel stochastic algorithm called DEcentralized probAbilistic Recursive gradiEnt deScenT (DEAREST), which integrates the techniques of variance reduction, gradient tracking, and multi-consensus. For the convergence analysis, we construct a Lyapunov function that simultaneously characterizes the function value, the gradient estimation error, and the consensus error. Based on this measure, we give a concise proof that DEAREST requires at most 𝒪(mn + √(mn)Lε^-2) incremental first-order oracle (IFO) calls and 𝒪(Lε^-2/√(1-λ_2(W))) communication rounds to find an ε-stationary point in expectation, where L is the smoothness parameter and λ_2(W) is the second-largest eigenvalue of the gossip matrix W. Both the IFO complexity and the communication complexity match the corresponding lower bounds. To the best of our knowledge, DEAREST is the first optimal algorithm for decentralized nonconvex finite-sum optimization.
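To make the three ingredients concrete, below is a minimal NumPy sketch of a DEAREST-style iteration on a synthetic decentralized least-squares problem. The step size eta, the switching probability p, the ring gossip matrix, the update ordering, and plain repeated mixing (standing in for the Chebyshev-accelerated multi-consensus used in the paper) are all illustrative assumptions, not the paper's exact algorithm or parameters.

```python
# Hedged sketch of a DEAREST-style update: probabilistic recursive gradient
# estimation (PAGE/SARAH style) + gradient tracking + multi-consensus mixing.
# All problem data and hyperparameters below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

m, n, d = 4, 50, 10                          # agents, local samples, dimension
eta = 0.05                                   # step size (assumed)
p = np.sqrt(n) / (np.sqrt(n) + n)            # prob. of a full local gradient (assumed)
K = 3                                        # gossip rounds per iteration (multi-consensus)

# Synthetic local finite sums: f_{i,j}(x) = 0.5 * ||A[i,j] x - b[i,j]||^2
A = rng.standard_normal((m, n, d, d)) / np.sqrt(d)
b = rng.standard_normal((m, n, d))

def grad(i, j, x):
    """Gradient of the (i, j)-th component function at x."""
    return A[i, j].T @ (A[i, j] @ x - b[i, j])

def full_local_grad(i, x):
    """Full gradient of f_i, averaging over the n local components."""
    return np.mean([grad(i, j, x) for j in range(n)], axis=0)

# Ring gossip matrix W: symmetric and doubly stochastic.
W = np.zeros((m, m))
for i in range(m):
    W[i, i], W[i, (i + 1) % m], W[i, (i - 1) % m] = 0.5, 0.25, 0.25

def mix(Z, rounds=K):
    """Multi-consensus: several gossip rounds (plain powers of W here;
    the paper uses accelerated mixing)."""
    for _ in range(rounds):
        Z = W @ Z
    return Z

X = np.tile(rng.standard_normal(d), (m, 1))                  # consensual start
V = np.stack([full_local_grad(i, X[i]) for i in range(m)])   # local estimators
G = V.copy()                                                 # gradient trackers

for t in range(200):
    X_new = mix(X - eta * G)
    # Probabilistic recursive gradient estimator:
    if rng.random() < p:
        # With probability p, recompute the full local gradient.
        V_new = np.stack([full_local_grad(i, X_new[i]) for i in range(m)])
    else:
        # Otherwise, apply a cheap single-sample recursive correction.
        js = rng.integers(n, size=m)
        V_new = np.stack([V[i] + grad(i, js[i], X_new[i]) - grad(i, js[i], X[i])
                          for i in range(m)])
    # Gradient tracking step, mixed over the network:
    G = mix(G + V_new - V)
    X, V = X_new, V_new

print("avg gradient norm:", np.linalg.norm(np.mean(G, axis=0)))
```

Intuitively, the probabilistic switch between an occasional full local gradient and a single-sample recursive correction is the variance-reduction mechanism behind the √(mn) factor in the IFO bound, while the trackers G (mixed by W, which preserves network averages since it is doubly stochastic) keep each agent's search direction close to the global average gradient estimator.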

