## Section: New Results

### Accelerated decentralized optimization with local updates for smooth and strongly convex objectives

We study the problem of minimizing a sum of smooth and strongly convex functions distributed over the nodes of a network, in a decentralized fashion. We propose ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number $\kappa$ of the local functions as well as on the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD reaches a precision $\epsilon$ in time $O\left((\tau_{\max} + \Delta_{\max})\sqrt{\kappa/\gamma}\,\ln(\epsilon^{-1})\right)$, where $\gamma$ is the spectral gap of the graph, $\tau_{\max}$ the maximum communication delay, and $\Delta_{\max}$ the maximum computation time. It therefore matches the rate of SSDA, which is optimal when $\tau_{\max} = \Omega(\Delta_{\max})$. Applying ESDACD to quadratic local functions yields an accelerated randomized gossip algorithm with rate $O\left(\sqrt{\theta_{\mathrm{gossip}}/n}\right)$, where $\theta_{\mathrm{gossip}}$ is the rate of standard randomized gossip. To the best of our knowledge, this is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
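For context on the gossip claim, the baseline whose rate $\theta_{\mathrm{gossip}}$ is referenced above is standard randomized gossip averaging: at each step a random edge activates and its two endpoints average their values. The sketch below is illustrative only (the function name and the ring topology are our assumptions, not part of the paper), and it implements the non-accelerated baseline, not ESDACD:

```python
import random

def randomized_gossip(values, edges, steps=10000, seed=0):
    """Standard randomized gossip averaging (the baseline, not ESDACD).

    At each step a uniformly random edge (i, j) activates and both
    endpoints replace their values by the pairwise average. Every node's
    value converges to the global mean of the initial values, at a
    geometric rate governed by the spectral gap of the graph.
    """
    rng = random.Random(seed)
    x = list(values)
    for _ in range(steps):
        i, j = rng.choice(edges)
        avg = (x[i] + x[j]) / 2.0
        x[i] = x[j] = avg
    return x

# Hypothetical example: a ring of 5 nodes with initial values 0..4,
# whose global mean is 2.0.
edges = [(i, (i + 1) % 5) for i in range(5)]
out = randomized_gossip([0.0, 1.0, 2.0, 3.0, 4.0], edges)
```

Each pairwise average preserves the sum of the values, so the only fixed point reachable from the initial state is the uniform mean vector; the accelerated variant derived from ESDACD improves the rate of this baseline by a square-root factor.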