Many works in statistics aim at designing a universal estimation procedure, that is, an estimator that converges to the best approximation of the (unknown) data-generating distribution within a model, without any assumption on this distribution. This question is of major interest, in particular because universality leads to robustness of the estimator. In this paper, we tackle the problem of universal estimation using the minimum distance estimator of Briol et al. (2019) based on the Maximum Mean Discrepancy. We show that this estimator is robust both to dependence within the data and to the presence of outliers in the dataset. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations.
CHERIEF-ABDELLATIF, B.-E. and ALQUIER, P. (2022). Finite sample properties of parametric MMD estimation: Robustness to misspecification and dependence. Bernoulli: A Journal of Mathematical Statistics and Probability, 28(1), pp. 181-213.
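As a rough illustration (not the paper's implementation), the sketch below fits a Gaussian location model N(theta, 1) to contaminated data by running stochastic gradient descent on a sample estimate of the squared Maximum Mean Discrepancy with a Gaussian kernel. Model samples are reparameterized as y = theta + eps, and for a location family the model-model term of MMD² has zero gradient in theta, so only the cross term is used. All settings (kernel bandwidth, step size, outlier fraction, sample sizes) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: 90% inliers from N(1, 1) and 10% outliers at 10.
data = np.concatenate([rng.normal(1.0, 1.0, 90), np.full(10, 10.0)])

def grad_k(a, b, gamma=1.0):
    """Gradient w.r.t. a of the Gaussian kernel k(a, b) = exp(-(a-b)^2 / (2 gamma^2))."""
    d = a[:, None] - b[None, :]
    return -(d / gamma**2) * np.exp(-d**2 / (2 * gamma**2))

def mmd_sgd(x, theta=0.0, steps=300, m=200, lr=0.1):
    """SGD on an estimate of MMD^2 between N(theta, 1) and the empirical data,
    using the reparameterization y = theta + eps with eps ~ N(0, 1)."""
    for _ in range(steps):
        eps = rng.standard_normal(m)
        y = theta + eps
        # For a location family, k(y_i, y_j) depends only on y_i - y_j, so the
        # model-model term of MMD^2 is constant in theta; the gradient reduces
        # to the cross term -2/(m n) * sum_ij k(y_i, x_j).
        grad = -2.0 * grad_k(y, x).mean()
        theta -= lr * grad
    return theta

theta_hat = mmd_sgd(data)
print(round(theta_hat, 2))  # lands near the inlier mean 1, despite the outliers
```

Note how the Gaussian kernel makes the far-away outliers contribute almost nothing to the gradient, which is the robustness phenomenon the abstract refers to; a maximum likelihood estimate (here, the sample mean, about 1.9) would be pulled toward the outliers.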