Mathématiques et Informatique Appliquées
du Génome à l'Environnement

Monday, 2 November 2020

Seminar
Speaker's affiliation
Université de Lille - INRIA, Modal project team
Speaker
Yaroslav Averyanov
Title
Early stopping in regression with reproducing kernels: some ideas towards optimality
Abstract
In this talk, I will discuss how to understand the behavior of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space, in the nonparametric regression framework. In particular, I will focus on the celebrated gradient descent and (iterative) kernel ridge regression algorithms. It is well known that nonparametric models offer great flexibility to the user; however, they tend to overfit, so some form of regularisation is needed, and this is where early stopping can help. More precisely, I will show how to construct a data-driven stopping rule that requires no validation set. The rule is based on the so-called minimum discrepancy principle, a technique borrowed from the inverse problem literature, and it turns out to be minimax optimal over different types of kernel spaces, including finite-rank kernels and Sobolev smoothness classes. In addition, I will discuss simulated experiments showing that the new strategy performs comparably to some extensively used model selection methods.
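
As a rough illustration of the minimum discrepancy principle mentioned in the abstract, the sketch below stops kernel gradient descent as soon as the empirical residual norm falls to the noise level. This is a minimal sketch only, not the speaker's estimator: the Gaussian kernel, the constant step size, the assumption of a known noise level, and names such as kernel_gd_minimum_discrepancy are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def kernel_gd_minimum_discrepancy(X, y, noise_level, max_iter=2000, bandwidth=0.5):
    """Kernel gradient descent on the least-squares risk, stopped by a
    minimum-discrepancy-type rule: iterate until the empirical residual
    norm drops below the (assumed known) noise level."""
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    # Constant step size chosen from the top eigenvalue of K/n for stability.
    step = 1.0 / np.linalg.eigvalsh(K / n)[-1]
    alpha = np.zeros(n)  # dual coefficients: f_t(.) = sum_i alpha_i k(., x_i)
    t = 0
    for t in range(1, max_iter + 1):
        residual = y - K @ alpha          # y_i - f_t(x_i) at the design points
        # Minimum discrepancy principle (illustrative form): stop as soon as
        # the training residual is no larger than the noise level.
        if np.sqrt(np.mean(residual**2)) <= noise_level:
            break
        alpha += step * residual / n      # functional gradient step
    return alpha, t

# Toy usage on simulated data where the noise level is known.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
noise_level = 0.3
y = np.sin(3.0 * X[:, 0]) + noise_level * rng.normal(size=100)
alpha, t_stop = kernel_gd_minimum_discrepancy(X, y, noise_level)
print("stopped at iteration", t_stop)
```

In practice the noise level would have to be estimated from the data; the talk concerns optimality guarantees for stopping rules of this kind rather than this particular toy implementation.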
Place
Meeting room 142, building 210