Section: New Results

Uniform regret bounds over R^d for the sequential linear regression problem with the square loss

In [45] we consider the setting of online linear regression for arbitrary deterministic sequences, with the square loss. We are interested in obtaining regret bounds that hold uniformly over all vectors Rd. When the feature sequence is known at the beginning of the game, they provided closed-form regret bounds of 2dB2lnT+O(1), where T is the number of rounds and B is a bound on the observations. Instead, we derive bounds with an optimal constant of 1 in front of the dB2lnT term. In the case of sequentially revealed features, we also derive an asymptotic regret bound of dB2lnT for any individual sequence of features and bounded observations. All our algorithms are variants of the online nonlinear ridge regression forecaster, either with a data-dependent regularization or with almost no regularization.