Section:
New Results
Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
In [47], we study the problem of minimizing a sum of smooth and
strongly convex functions split over the nodes of a network in a
decentralized fashion. We propose a decentralized
accelerated algorithm that only requires local synchrony. Its rate
depends on the condition number $\kappa$ of the local functions as well
as the network topology and delays. Under mild assumptions on the
topology of the graph, our algorithm takes a time
$O\big((\tau_{\max} + \Delta_{\max})\sqrt{\kappa/\gamma}\,\ln(\varepsilon^{-1})\big)$ to reach a
precision $\varepsilon$, where $\gamma$ is the spectral gap of the graph,
$\tau_{\max}$ the maximum communication delay and $\Delta_{\max}$ the
maximum computation time. Therefore, it matches the rate of
SSDA, which is optimal when $\tau_{\max} = \Omega(\Delta_{\max})$. Applying our algorithm to quadratic local
functions leads to an accelerated randomized gossip algorithm of rate
$O(\sqrt{\theta_{\mathrm{gossip}}/n})$, where $\theta_{\mathrm{gossip}}$ is the
rate of the standard randomized gossip. To
the best of our knowledge, it is the first asynchronous gossip algorithm
with a provably improved rate of convergence of the second moment of the
error. We illustrate these results with experiments in idealized settings.
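To make the gossip baseline concrete, here is a minimal sketch of standard pairwise randomized gossip averaging, the scheme whose rate the accelerated algorithm of [47] improves on. The function name, the complete-graph topology, and the step count are illustrative choices, not taken from the paper:

```python
import random

# Standard pairwise randomized gossip: at each step one random edge
# (i, j) is activated and both endpoints replace their values by the
# pairwise average. The squared distance to the network-wide mean
# (the second moment of the error) contracts at a rate governed by
# the spectral gap of the underlying gossip matrix.
def randomized_gossip(values, edges, steps, seed=0):
    rng = random.Random(seed)
    x = list(values)
    for _ in range(steps):
        i, j = rng.choice(edges)
        avg = 0.5 * (x[i] + x[j])
        x[i] = x[j] = avg
    return x

n = 16
# Complete graph: every pair of nodes can gossip (fast mixing).
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
x0 = [float(i) for i in range(n)]
mean = sum(x0) / n

xT = randomized_gossip(x0, edges, steps=2000)
err0 = sum((v - mean) ** 2 for v in x0)
errT = sum((v - mean) ** 2 for v in xT)
print(errT < 1e-6 * err0)  # error second moment has shrunk sharply
```

Each pairwise average preserves the global sum, so all nodes converge to the initial mean; the accelerated variant studied in [47] improves how fast that contraction happens as a function of the spectral gap.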