Robust statistical procedures are designed to remain stable under model misspecification or contamination, a crucial issue in modern applications. After a brief introduction to robust estimation, we present rho-estimators, which are constructed from a modification of the likelihood ratio test and exhibit strong robustness properties.
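To make the instability alluded to above concrete, here is a minimal sketch (an illustration of the general idea only, not of the rho-estimators discussed in this work) comparing the sample mean, i.e. the Gaussian maximum likelihood estimator of location, with the median on an epsilon-contaminated Gaussian sample; the sample size, contamination fraction, and outlier location are arbitrary choices, and numpy is assumed available.

```python
import numpy as np

rng = np.random.default_rng(0)

# Epsilon-contaminated Gaussian sample: a small fraction of
# gross outliers is mixed into otherwise standard normal data
# (n, eps and the outlier location are arbitrary illustration values).
n, eps = 200, 0.05
clean = rng.normal(loc=0.0, scale=1.0, size=n)
outliers = rng.normal(loc=20.0, scale=1.0, size=int(eps * n))
sample = np.concatenate([clean, outliers])

# The sample mean is pulled towards the outliers, whereas the median,
# a classical robust location estimator, remains close to the true value 0.
print("mean  :", sample.mean())
print("median:", np.median(sample))
```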
We then investigate their performance in two important classes of models. We first consider mixture models, where robustness is particularly relevant because of their sensitivity to outliers and to deviations from the assumed model. We next study hidden Markov models (HMMs), which require extending the original rho-estimation framework beyond the independent setting. We show that this extension is not specific to HMMs and applies more broadly to models for dependent data, including Langevin diffusions.