Section: New Results

Finite-sample Analysis of M-estimators using Self-concordance

In [50], we demonstrate how self-concordance of the loss allows one to obtain asymptotically optimal rates for M-estimators in finite-sample regimes. We consider two classes of losses: (i) self-concordant losses, whose third derivative is uniformly bounded by the 3/2 power of the second; (ii) pseudo self-concordant losses, for which the third derivative is bounded by the second derivative itself. These classes contain some losses arising in generalized linear models, including the logistic loss; in addition, the second class includes some common pseudo-Huber losses. Our main results establish the critical sample size sufficient to reach the asymptotically optimal excess risk in both cases. Denoting by d the parameter dimension and by d_e the effective dimension, which accounts for possible model misspecification, we find the critical sample size to be O(d_e·d) for the first class of losses and O(ρ·d_e·d) for the second, where ρ is a problem-dependent parameter characterizing the curvature of the risk at the best predictor θ*. In contrast to existing results, we impose only local assumptions on the data distribution, namely that the calibrated design, i.e., the design scaled by the square root of the second derivative of the loss, is subgaussian at the best predictor. Moreover, under the additional assumption that the calibrated design is subgaussian over the Dikin ellipsoid of θ*, we obtain improved bounds on the critical sample size, scaling near-linearly in max(d_e, d). Motivated by these findings, we construct canonically self-concordant analogues of the Huber and logistic losses with improved statistical properties. Finally, we extend some of the above results to ℓ1-penalized M-estimators in high-dimensional setups.
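
For concreteness, the two smoothness conditions can be written as follows; this is a minimal sketch in LaTeX notation, where the constants 2 and 1 follow the usual conventions for self-concordant and pseudo self-concordant functions and are our assumption, since they are not spelled out above:

\[
|\ell'''(t)| \le 2\,\ell''(t)^{3/2} \quad \text{(self-concordant)},
\qquad
|\ell'''(t)| \le \ell''(t) \quad \text{(pseudo self-concordant)},
\]

where \(\ell\) denotes the univariate loss as a function of the linear predictor. Under these conditions, the critical sample sizes stated above read, up to constant factors,

\[
n \gtrsim d_e\, d
\quad \text{and} \quad
n \gtrsim \rho\, d_e\, d ,
\]

for the first and second class of losses, respectively.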