Asynchronous Accelerated Proximal Stochastic Gradient for Strongly Convex Distributed Finite Sums

01/28/2019
by Hadrien Hendrikx, et al.

In this work, we study the problem of minimizing a sum of strongly convex functions split over a network of n nodes. We propose ADFS, a decentralized and asynchronous algorithm, to tackle the case in which the local functions are themselves finite sums with m components. ADFS converges linearly when the local functions are smooth, and matches the rates of the best known finite-sum algorithms when executed on a single machine. On several machines, ADFS enjoys an O(√n) or O(n) speed-up, depending on the leading complexity term, as long as the diameter of the network is not too large relative to m. This also yields a √m speed-up over state-of-the-art distributed batch methods, which is the speed-up expected of finite-sum algorithms. In terms of communication times and network parameters, ADFS scales as well as optimal distributed batch algorithms. As a side contribution, we give a generalized version of the accelerated proximal coordinate gradient algorithm with arbitrary sampling, which we apply to a well-chosen dual problem to derive ADFS. Yet ADFS uses primal proximal updates that only require solving one-dimensional problems for many standard machine learning applications. Finally, ADFS can be formulated for non-smooth objectives with equally good scaling properties. We illustrate the improvements of ADFS over state-of-the-art approaches with simulations.
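
The claim that ADFS's primal proximal updates reduce to one-dimensional problems can be made concrete with a small sketch. The following is illustrative only, not the paper's implementation: the function names, the choice of squared and logistic losses, and the Newton-based scalar solver are all assumptions. For a single-sample loss f, the update prox_{ηf}(x) = argmin_v f(v) + (v - x)² / (2η) is a scalar minimization, with a closed form for squared loss and a few Newton steps for logistic loss:

    import math

    def prox_squared_loss(x, y, eta):
        # prox of f(v) = 0.5 * (v - y)**2: the objective
        # f(v) + (v - x)**2 / (2 * eta) has a closed-form minimizer.
        return (x + eta * y) / (1.0 + eta)

    def prox_logistic_loss(x, y, eta, n_steps=20):
        # prox of f(v) = log(1 + exp(-y * v)) with label y in {-1, +1}.
        # The prox objective is scalar and strongly convex (its second
        # derivative is at least 1/eta), so a few Newton steps suffice.
        v = x
        for _ in range(n_steps):
            s = 1.0 / (1.0 + math.exp(y * v))    # sigmoid(-y * v)
            grad = -y * s + (v - x) / eta        # derivative of prox objective
            hess = y * y * s * (1.0 - s) + 1.0 / eta
            v -= grad / hess
        return v

    # One proximal update on a single sample:
    print(prox_squared_loss(0.3, 1.0, eta=0.5))  # (0.3 + 0.5) / 1.5 = 0.5333...
    print(prox_logistic_loss(0.3, 1.0, eta=0.5))

Since each such update touches one sample and one scalar, its cost is independent of both m and n, which is what keeps the per-iteration cost of coordinate-style proximal updates cheap.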
