Year
2018
Authors
ALQUIER Pierre, CHERIEF-ABDELLATIF Badr-Eddine
Abstract
Mixture models are widely used in Bayesian statistics and machine learning, notably in computational biology, natural language processing, and many other fields. Variational inference, a technique for approximating intractable posteriors by means of optimization algorithms, is extremely popular in practice when dealing with complex models such as mixtures. The contribution of this paper is twofold. First, we study the concentration of variational approximations of posteriors, which is still an open problem for general mixtures, and we derive consistency and rates of convergence. Second, we tackle the problem of model selection for the number of components: we study the approach already used in practice, which consists of maximizing a numerical criterion, the Evidence Lower Bound (ELBO). We prove that this strategy indeed leads to strong oracle inequalities. We illustrate our theoretical results with applications to Gaussian and multinomial mixtures.
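The model-selection strategy studied in the paper (fit each candidate number of components, then keep the one maximizing the ELBO) can be sketched in a much simpler setting than the paper's. The sketch below is illustrative only and makes assumptions not in the abstract: a 1-D Gaussian mixture with known unit variance, fixed uniform weights 1/K, a N(0, s0sq) prior on the component means, and plain coordinate-ascent variational inference (CAVI) with a mean-field family.

```python
import math

def cavi_elbo(xs, K, s0sq=10.0, iters=50):
    """Mean-field CAVI for a 1-D Gaussian mixture with unit variance,
    uniform weights 1/K, and prior mu_k ~ N(0, s0sq).
    Returns the final Evidence Lower Bound (ELBO).
    This is an illustrative sketch, not the paper's exact setting."""
    n = len(xs)
    xmin, xmax = min(xs), max(xs)
    # Deterministic init: spread variational means over the data range.
    m = [xmin + (k + 0.5) * (xmax - xmin) / K for k in range(K)]
    v = [1.0] * K
    log_pi = math.log(1.0 / K)
    for _ in range(iters):
        # Update q(z_i): responsibilities r[i][k] (softmax of expected log-lik).
        r = []
        for x in xs:
            logits = [-0.5 * ((x - m[k]) ** 2 + v[k]) for k in range(K)]
            mx = max(logits)
            ws = [math.exp(l - mx) for l in logits]
            s = sum(ws)
            r.append([w / s for w in ws])
        # Update q(mu_k) = N(m_k, v_k) given the responsibilities.
        for k in range(K):
            nk = sum(r[i][k] for i in range(n))
            prec = 1.0 / s0sq + nk  # posterior precision (unit noise variance)
            m[k] = sum(r[i][k] * xs[i] for i in range(n)) / prec
            v[k] = 1.0 / prec
    # ELBO = E_q[log p(x, z, mu)] - E_q[log q(z, mu)] at the final parameters.
    elbo = 0.0
    for i, x in enumerate(xs):
        for k in range(K):
            rik = r[i][k]
            if rik > 0:
                elbo += rik * (-0.5 * math.log(2 * math.pi)
                               - 0.5 * ((x - m[k]) ** 2 + v[k])
                               + log_pi - math.log(rik))
    for k in range(K):
        elbo += (-0.5 * math.log(2 * math.pi * s0sq)
                 - (m[k] ** 2 + v[k]) / (2 * s0sq)
                 + 0.5 * math.log(2 * math.pi * math.e * v[k]))
    return elbo

# Toy data: two well-separated clusters (fixed values, no randomness needed).
xs = [-2.0 + 0.02 * i for i in range(-50, 50)] + \
     [2.0 + 0.02 * i for i in range(-50, 50)]
# Fit each candidate K and select the one with the highest ELBO.
elbos = {K: cavi_elbo(xs, K) for K in (1, 2, 3, 4)}
best_K = max(elbos, key=elbos.get)
```

On such clearly bimodal data the single-component fit has a much lower ELBO than the two-component fit, which is the numerical criterion the paper analyses theoretically.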
CHERIEF-ABDELLATIF, B.E. and ALQUIER, P. (2018). Consistency of variational Bayes inference for estimation and model selection in mixtures. Electronic Journal of Statistics, 12(2), pp. 2995-3035.