Year
2020
Authors
ALQUIER Pierre, CHERIEF-ABDELLATIF Badr-Eddine
Abstract
In some misspecified settings, the posterior distribution in Bayesian statistics may lead to inconsistent estimates. To fix this issue, it has been suggested to replace the likelihood by a pseudo-likelihood, that is, the exponential of a loss function enjoying suitable robustness properties. In this paper, we build a pseudo-likelihood based on the Maximum Mean Discrepancy, defined via an embedding of probability distributions into a reproducing kernel Hilbert space. We show that this MMD-Bayes posterior is consistent and robust to model misspecification. As the posterior obtained in this way might be intractable, we also prove that reasonable variational approximations of this posterior enjoy the same properties. We provide details on a stochastic gradient algorithm to compute these variational approximations. Numerical simulations indeed suggest that our estimator is more robust to misspecification than those based on the likelihood.
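A minimal sketch of the construction described in the abstract (the notation and the temperature parameter \beta below are our own shorthand, not taken verbatim from the paper): writing k for the kernel, \mathbb{D}_k for the associated Maximum Mean Discrepancy, \hat{P}_n for the empirical distribution of the sample and \pi for the prior, the MMD-Bayes pseudo-posterior replaces the likelihood by the exponential of the negative squared MMD,

\[ \pi_n^{\beta}(\theta) \;\propto\; \exp\!\big(-\beta\, \mathbb{D}_k^2(P_\theta, \hat{P}_n)\big)\, \pi(\theta), \qquad \beta > 0, \]

and a variational approximation is obtained by minimizing, over a tractable family \mathcal{F},

\[ \hat{q} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{F}} \; \beta\, \mathbb{E}_{\theta \sim q}\!\big[\mathbb{D}_k^2(P_\theta, \hat{P}_n)\big] \;+\; \mathrm{KL}(q \,\|\, \pi), \]

an objective that can be optimized by stochastic gradient descent using a reparameterization of q together with an unbiased estimator of the squared MMD.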
CHERIEF-ABDELLATIF, B.E. and ALQUIER, P. (2020). MMD-Bayes: Robust Bayesian Estimation via Maximum Mean Discrepancy. In: 2nd Symposium on Advances in Approximate Bayesian Inference (AABI'19). Proceedings of Machine Learning Research.