Acceleration of stochastic methods on the example of decentralized SGD

11/15/2020
by Ekaterina Trimbach, et al.

In this paper, we present an algorithm for accelerating decentralized stochastic gradient descent. Decentralized stochastic optimization methods have recently attracted considerable attention, mainly due to their low per-iteration cost, data locality, and communication efficiency; they generalize algorithms such as SGD and Local SGD. A further contribution of this work is an extension of the analysis of accelerated stochastic methods that makes acceleration achievable in the decentralized setting.
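To make the setting concrete, below is a minimal sketch of plain (non-accelerated) decentralized SGD, the baseline the paper builds on: each node takes a local stochastic gradient step on its own data and then gossip-averages its iterate with its neighbors through a mixing matrix. Everything here (the quadratic toy objective, the ring topology, the step size, and all variable names) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n nodes, each holding m local samples of a least-squares problem.
n_nodes, m, d = 8, 50, 5
A = [rng.normal(size=(m, d)) for _ in range(n_nodes)]
b = [Ai @ rng.normal(size=d) + 0.1 * rng.normal(size=m) for Ai in A]

# Symmetric, doubly stochastic mixing matrix for a ring topology:
# each node averages its iterate with its two neighbors.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n_nodes] = 1 / 3
    W[i, (i + 1) % n_nodes] = 1 / 3

x = np.zeros((n_nodes, d))  # one local iterate per node
lr = 0.01

for t in range(2000):
    # Local stochastic gradient step: each node samples one of its own rows.
    grads = np.empty_like(x)
    for i in range(n_nodes):
        j = rng.integers(m)
        grads[i] = (A[i][j] @ x[i] - b[i][j]) * A[i][j]
    x = x - lr * grads
    # Gossip step: nodes communicate only with their neighbors via W.
    x = W @ x

# Consensus error: how far the nodes' iterates are from their average.
print("disagreement:", np.linalg.norm(x - x.mean(axis=0)))
```

The paper's contribution concerns adding Nesterov-style acceleration on top of this scheme; the sketch above shows only the unaccelerated baseline dynamics (local step plus gossip averaging).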
