Monday, December 12, 2022

Speaker's institution (or team, for internal seminars)
Télécom SudParis
Speaker
Sholom Schechtman
Stochastic subgradient descent on weakly convex functions escapes active strict saddles

In the first part of this presentation, we will introduce the notion of an active strict saddle and show that this notion is generic in the class of semi-algebraic functions.
In the second part, we will show that stochastic subgradient descent (SGD) on a weakly convex function avoids active strict saddles with probability one. As a consequence, "generically" on a weakly convex, semi-algebraic function, SGD converges to a local minimum.
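The escape phenomenon described in the abstract can be illustrated numerically. The sketch below is a hypothetical toy example (not taken from the talk): the weakly convex function f(x, y) = |x| + (y² − 1)² has an active strict saddle at (0, 0) — the active manifold is {x = 0}, along which y = 0 is a strict local maximum — and local minima at (0, ±1). Stochastic subgradient descent with diminishing steps and zero-mean noise, started exactly at the saddle, drifts away from it and settles near a local minimum.

```python
import math
import random

def f(x, y):
    # Hypothetical weakly convex test function with an active strict
    # saddle at (0, 0) and local minima at (0, +1) and (0, -1).
    return abs(x) + (y**2 - 1)**2

def subgrad(x, y):
    # One subgradient of f: sign(x) for |x| (choose 0 at x = 0),
    # and the ordinary gradient 4*y*(y^2 - 1) in the smooth variable y.
    gx = 0.0 if x == 0 else math.copysign(1.0, x)
    gy = 4.0 * y * (y**2 - 1.0)
    return gx, gy

def sgd(x, y, steps=20000, seed=0):
    rng = random.Random(seed)
    for k in range(1, steps + 1):
        gx, gy = subgrad(x, y)
        step = 0.1 / math.sqrt(k)              # diminishing step sizes
        x -= step * (gx + rng.gauss(0.0, 0.1))  # additive zero-mean noise
        y -= step * (gy + rng.gauss(0.0, 0.1))
    return x, y

x, y = sgd(0.0, 0.0)  # start exactly at the active strict saddle
# The noise pushes y off the unstable manifold {y = 0}; the iterates
# end up near one of the local minima (0, +1) or (0, -1).
```

The noise is what makes the escape work: without it, the iteration started at (0, 0) would stay there forever, since the chosen subgradient vanishes at the saddle.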

Meeting room 142, building 210