Section: New Results
Simpler PAC-Bayesian Bounds for Hostile Data
Participant : Benjamin Guedj.
An original and much simpler way of deriving PAC-Bayesian bounds has been introduced through the use of f-divergences (thereby generalizing earlier works on the Rényi divergence and the Kullback-Leibler divergence). This work is published in Machine Learning [13].
This is joint work with Pierre Alquier from ENSAE - Université Paris-Saclay.