

Section: New Results

Sparse Accelerated Exponential Weights

In [8], we consider the stochastic optimization problem in which a convex function is minimized while its gradients are observed recursively. We introduce SAEW, a new procedure that accelerates exponential weights procedures from the slow rate 1/√T to the fast rate 1/T. Under strong convexity of the risk, we achieve the optimal rate of convergence for approximating sparse parameters in ℝ^d. The acceleration is obtained by applying successive averaging steps in an online fashion. The procedure also produces sparse estimators thanks to additional hard-threshold steps.
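
To illustrate the general pattern described above (and not the exact SAEW procedure of [8]), the sketch below combines an online first-order scheme with in-stage averaging and a hard-threshold step between stages. The stage lengths, step sizes, threshold level tau, and the use of plain online gradient steps in place of an exponential-weights update are all assumptions made for the sake of the example.

```python
import numpy as np

def hard_threshold(w, tau):
    """Zero out coordinates whose magnitude is below tau (keeps the estimate sparse)."""
    w = w.copy()
    w[np.abs(w) < tau] = 0.0
    return w

def staged_averaged_online_minimizer(grad_oracle, d, n_stages=5, stage_len=200,
                                     step=0.1, tau=0.05, seed=0):
    """Illustrative staged procedure: online gradient steps, averaging of the
    iterates within each stage, then a hard-threshold step before the next
    stage.  Hypothetical parameters; not the SAEW algorithm of [8]."""
    rng = np.random.default_rng(seed)
    w = np.zeros(d)                     # current estimate, warm-starts each stage
    for stage in range(n_stages):
        iterate = w.copy()
        running_avg = np.zeros(d)
        for t in range(1, stage_len + 1):
            g = grad_oracle(iterate, rng)                 # noisy gradient at the iterate
            iterate = iterate - step / np.sqrt(t) * g     # online gradient step
            running_avg += (iterate - running_avg) / t    # online average of iterates
        w = hard_threshold(running_avg, tau)              # sparsify before next stage
    return w

# Toy usage: sparse least-squares risk E[(x.w - y)^2] with a 3-sparse target.
if __name__ == "__main__":
    d = 50
    w_star = np.zeros(d)
    w_star[:3] = [1.0, -2.0, 0.5]

    def grad_oracle(w, rng):
        x = rng.normal(size=d)
        y = x @ w_star + 0.1 * rng.normal()
        return 2.0 * (x @ w - y) * x                      # stochastic gradient of squared loss

    w_hat = staged_averaged_online_minimizer(grad_oracle, d)
    print("non-zero coordinates:", np.nonzero(w_hat)[0])
```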