Articles (2018), The Electronic Journal of Statistics, 12 (2), pp. 2995-3035

Consistency of variational Bayes inference for estimation and model selection in mixtures

CHERIEF-ABDELLATIF Badr-Eddine, ALQUIER Pierre

Mixture models are widely used in Bayesian statistics and machine learning, in particular in computational biology, natural language processing and many other fields. Variational inference, a technique for approximating intractable posteriors via optimization, is extremely popular in practice when dealing with complex models such as mixtures. The contribution of this paper is twofold. First, we study the concentration of variational approximations of posteriors, which is still an open problem for general mixtures, and we derive consistency and rates of convergence. Second, we tackle the problem of model selection for the number of components: we study the approach already used in practice, which consists in maximizing a numerical criterion (the Evidence Lower Bound). We prove that this strategy indeed leads to strong oracle inequalities. We illustrate our theoretical results with applications to Gaussian and multinomial mixtures.
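The ELBO-maximization strategy studied in the paper can be illustrated with a short sketch. This is not the authors' code: it uses scikit-learn's mean-field variational Gaussian mixture (`BayesianGaussianMixture`, whose `lower_bound_` attribute stores the final ELBO) on synthetic data, fitting one model per candidate number of components and keeping the candidate with the highest criterion value.

```python
# Hedged sketch of ELBO-based selection of the number of mixture
# components; scikit-learn and the synthetic data are illustrative
# choices, not part of the paper.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from three well-separated Gaussians.
X = np.concatenate([
    rng.normal(-5.0, 1.0, size=(200, 1)),
    rng.normal(0.0, 1.0, size=(200, 1)),
    rng.normal(5.0, 1.0, size=(200, 1)),
])

def select_k(X, k_max=6):
    """Fit a variational GMM for each candidate k = 1..k_max and
    return the k maximizing the ELBO, together with all ELBO values."""
    elbos = {}
    for k in range(1, k_max + 1):
        vb = BayesianGaussianMixture(
            n_components=k, max_iter=500, random_state=0,
        ).fit(X)
        elbos[k] = vb.lower_bound_  # final value of the ELBO
    best_k = max(elbos, key=elbos.get)
    return best_k, elbos

best_k, elbos = select_k(X)
```

One caveat: scikit-learn's variational GMM can empty out superfluous components on its own, so the ELBO curve often plateaus beyond the true number of components rather than dropping sharply; the paper's oracle inequalities concern the criterion itself, not any particular implementation.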

CHERIEF-ABDELLATIF, B.-E. and ALQUIER, P. (2018). Consistency of variational Bayes inference for estimation and model selection in mixtures. The Electronic Journal of Statistics, 12(2), pp. 2995-3035.

Keywords: #Mixture-models, #frequentist-evaluation-of-Bayesian-methods, #variational-approximations, #model-selection