Section: New Results
Accelerated decentralized optimization with local updates for smooth and strongly convex objectives
We study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose the algorithm ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number $\kappa_l$ of the local functions as well as on the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD takes a time $O((\tau_{\max} + \Delta_{\max})\sqrt{\kappa_l/\gamma}\,\ln(\varepsilon^{-1}))$ to reach a precision $\varepsilon$, where $\gamma$ is the spectral gap of the graph, $\tau_{\max}$ the maximum communication delay and $\Delta_{\max}$ the maximum computation time. Therefore, it matches the rate of SSDA, which is optimal when $\tau_{\max} = \Omega(\Delta_{\max})$. Applying ESDACD to quadratic local functions leads to an accelerated randomized gossip algorithm of rate $O(\sqrt{\theta_{\mathrm{gossip}}/n})$, where $\theta_{\mathrm{gossip}}$ is the rate of the standard randomized gossip. To the best of our knowledge, this is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
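For concreteness, the following Python sketch implements the baseline against which the accelerated rate is compared: standard (non-accelerated) randomized gossip averaging, here on a complete graph. The function names (`run_gossip`, `complete_graph_edges`) and the specific graph and step count are illustrative choices, not taken from the paper; the point is only the update rule, in which a randomly activated edge replaces both endpoint values by their pairwise average, so that the second moment of the error decays at the rate denoted $\theta_{\mathrm{gossip}}$ above.

```python
import random

def complete_graph_edges(n):
    # All unordered pairs (i, j), i < j: the complete graph on n nodes.
    return [(i, j) for i in range(n) for j in range(i + 1, n)]

def run_gossip(values, edges, steps, rng):
    # Standard randomized gossip: at each step a uniformly random edge
    # activates and both endpoints move to their pairwise average.
    # Each update preserves the sum of the values, so the iterates
    # converge to the average of the initial values.
    x = list(values)
    for _ in range(steps):
        i, j = rng.choice(edges)
        avg = 0.5 * (x[i] + x[j])
        x[i] = avg
        x[j] = avg
    return x

rng = random.Random(0)  # fixed seed for reproducibility
n = 8
x0 = [float(i) for i in range(n)]          # initial local values
x = run_gossip(x0, complete_graph_edges(n), steps=500, rng=rng)
mean = sum(x0) / n
err = max(abs(v - mean) for v in x)        # worst-case deviation from the average
```

On a well-connected graph such as this one, a few hundred activations suffice for all values to concentrate around the initial mean; on sparse topologies the decay is governed by the spectral gap $\gamma$, which is what the accelerated variant improves upon.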