EM for mixtures - Initialization requires special care


1 LSTA - Laboratoire de Statistique Théorique et Appliquée; 2 SELECT - Model selection in statistical learning, Inria Saclay - Île de France, LMO - Laboratoire de Mathématiques d'Orsay, CNRS - Centre National de la Recherche Scientifique: UMR

Abstract: Maximum likelihood via the EM algorithm is widely used to estimate the parameters of hidden structure models such as Gaussian mixture models. But the EM algorithm has well-documented drawbacks: its solution can depend strongly on its initial position, and it may fail as a result of degeneracies. We stress the practical dangers of these limitations and the care with which they should be handled. Our main conclusion is that no method addresses them satisfactorily in all situations. But improvements are introduced by, first, using a penalized log-likelihood of Gaussian mixture models in a Bayesian regularization perspective and, second, choosing the best among several relevant initialization strategies. In this perspective, we also propose new recursive initialization strategies which prove helpful. They are compared with standard initialization procedures through numerical experiments, and their effects on model selection criteria are analyzed.
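The abstract's two warnings (initialization dependence and degeneracies) are easy to reproduce on a toy problem. The sketch below is a minimal, hypothetical illustration of both, not the authors' penalized or recursive procedures: a plain EM for a two-component 1D Gaussian mixture, where `em_gmm_1d` and its variance floor are assumptions of this example. Starting the means well apart recovers both components, while starting them at the same value is a symmetric fixed point of EM: the components never separate and the fit collapses to a single Gaussian with a much lower log-likelihood.

```python
import math
import random

def em_gmm_1d(data, mu_init, n_iter=100):
    """Plain EM for a two-component 1D Gaussian mixture (toy sketch).

    A small floor on the variances guards against the degeneracies
    (variance collapsing to zero on a single point) the paper warns about.
    """
    w = [0.5, 0.5]          # mixing weights
    mu = list(mu_init)      # component means
    var = [1.0, 1.0]        # component variances

    def dens(x, k):
        # weight_k * N(x | mu_k, var_k)
        return w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) \
            / math.sqrt(2 * math.pi * var[k])

    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [dens(x, 0), dens(x, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: responsibility-weighted re-estimation of the parameters.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(
                sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk,
                1e-6,  # variance floor against degeneracy
            )

    # Observed-data log-likelihood of the final fit.
    loglik = sum(math.log(dens(x, 0) + dens(x, 1)) for x in data)
    return mu, loglik

random.seed(0)
data = ([random.gauss(-5, 1) for _ in range(100)]
        + [random.gauss(5, 1) for _ in range(100)])

# Well-chosen starting means find both components ...
mu_good, ll_good = em_gmm_1d(data, [-4.0, 4.0])
# ... while identical starting means give equal responsibilities forever:
# the two components stay merged in a single-Gaussian fit.
mu_bad, ll_bad = em_gmm_1d(data, [0.0, 0.0])

print("good init:", mu_good, ll_good)
print("bad init: ", mu_bad, ll_bad)
```

This is the behaviour the paper's multiple-start and recursive strategies are designed to mitigate: run several initializations and keep the one with the highest (penalized) log-likelihood.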

Keywords: model selection criteria, regularized likelihood, recursive initialization, EM algorithm, initialization strategies, Gaussian mixture models

Authors: Jean-Patrick Baudry, Gilles Celeux

Source: https://hal.archives-ouvertes.fr/

