Abstract
In the first part of this presentation, we will introduce the notion of an active strict saddle and show that it is generic in the class of semi-algebraic functions.
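(A standard illustrative example, not taken from this abstract: the weakly convex function $f(x,y) = |x| - y^2$ has an active strict saddle at the origin. The active manifold is $\mathcal{M} = \{(0,y) : y \in \mathbb{R}\}$; the restriction $f|_{\mathcal{M}}(y) = -y^2$ has strictly negative curvature along $\mathcal{M}$, while $f$ grows sharply, like $|x|$, in directions normal to $\mathcal{M}$.)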
In the second part, we will show that stochastic subgradient descent (SGD) on a weakly convex function avoids active strict saddles with probability one. As a consequence, "generically" on a weakly convex, semi-algebraic function, SGD converges to a local minimum.
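As a concrete illustration of the method in question, here is a minimal sketch of the stochastic subgradient method on a weakly convex function. The objective $f(x) = |x^2 - 1|$, the noise model, and the step sizes below are illustrative assumptions, not material from the talk.

```python
import numpy as np

# Illustrative objective: f(x) = |x^2 - 1| is nonsmooth, weakly convex,
# and semi-algebraic, with local minima at x = +/-1 and a strict-saddle-type
# critical point (a local maximum) at x = 0.

def subgradient(x):
    """Return one element of the subdifferential of f(x) = |x^2 - 1|."""
    return np.sign(x**2 - 1.0) * 2.0 * x

rng = np.random.default_rng(seed=0)
x = 3.0  # arbitrary starting point
for k in range(1, 10_001):
    g = subgradient(x)                   # subgradient oracle
    xi = rng.normal(loc=0.0, scale=0.1)  # zero-mean stochastic noise
    x -= 0.1 / np.sqrt(k) * (g + xi)     # diminishing step sizes

print(f"final iterate: {x:.4f}")  # typically close to +1 or -1, not to 0
```

In this sketch the iterates settle near one of the local minimizers $x = \pm 1$ rather than at the critical point $x = 0$, consistent with the avoidance result described above.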