Section: New Results

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

Participants: Simon Lacoste-Julien, Francis Bach.

In this work we introduce a new optimisation method called SAGA, in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and supports composite objectives, where a proximal operator is applied to the regulariser. Unlike SDCA, SAGA handles non-strongly convex problems directly and is adaptive to any inherent strong convexity of the problem. Moreover, the proof of the convergence bounds is considerably simpler than that of our earlier work on SAG. (In collaboration with A. Defazio, ANU.)
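
To make the update concrete, the following is a minimal Python/NumPy sketch of a SAGA-style iteration on an illustrative l1-regularised least-squares problem. The problem instance, the function name saga_l1_least_squares, and the parameter choices are assumptions made for this example only; soft-thresholding plays the role of the proximal operator of the l1 regulariser.

    import numpy as np

    def saga_l1_least_squares(A, b, gamma, lam, n_iter=10000, seed=0):
        # Sketch of SAGA for: min_w (1/n) sum_i 0.5*(a_i^T w - b_i)^2 + lam*||w||_1.
        # The non-smooth l1 term is handled via its proximal operator.
        rng = np.random.default_rng(seed)
        n, d = A.shape
        w = np.zeros(d)
        # Table of stored per-example gradients f_i'(phi_i), initialised at w = 0,
        # together with their running average.
        grads = (A @ w - b)[:, None] * A
        grad_avg = grads.mean(axis=0)
        for _ in range(n_iter):
            j = rng.integers(n)
            g_new = (A[j] @ w - b[j]) * A[j]    # fresh gradient of f_j at w
            v = g_new - grads[j] + grad_avg     # unbiased, variance-reduced estimate
            z = w - gamma * v                   # gradient step ...
            w = np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)  # ... then prox
            grad_avg += (g_new - grads[j]) / n  # keep the table average in sync
            grads[j] = g_new                    # overwrite the stored gradient
        return w

A step size consistent with the theory is gamma = 1/(3L), where L bounds the Lipschitz constants of the individual gradients; for this least-squares instance L = max_i ||a_i||^2, so one would call, e.g., saga_l1_least_squares(A, b, gamma=1/(3*np.max(np.sum(A**2, axis=1))), lam=0.1).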