Section: New Results
Optimal Algorithms for Non-Smooth Distributed Optimization in Networks
In this work, we consider the distributed optimization of non-smooth convex
functions using a network of computing units. We investigate this problem
under two regularity assumptions: (1) the Lipschitz continuity of the global
objective function, and (2) the Lipschitz continuity of the local individual
functions. Under the local regularity assumption, we provide the first
optimal first-order decentralized algorithm, called multi-step primal-dual
(MSPD), together with its corresponding optimal convergence rate. A notable
aspect of
this result is that, for non-smooth functions, while the dominant term of
the error is in