Minimum divergence estimators are derived through the dual form of the divergence in parametric models; they generalize the classical maximum likelihood estimators. Models with unobserved data, such as mixture models, can be estimated with EM algorithms, which are known to converge to stationary points of the likelihood function under general assumptions. This paper presents an extension of the EM algorithm based on the minimization of the dual approximation of the divergence between the empirical measure and the model, using a proximal-type algorithm. The algorithm converges to stationary points of the empirical criterion under general conditions on the divergence and the model. Robustness properties of this algorithm are also presented. We provide an alternative proof of convergence of the EM algorithm for a two-component Gaussian mixture. Simulations on Gaussian and Weibull mixtures are performed to compare the results with the MLE.
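For context, the classical special case that the paper's proximal-type generalization recovers (when the divergence is the Kullback-Leibler divergence) is plain EM for a two-component Gaussian mixture. The sketch below illustrates that baseline only; it is not the paper's generalized algorithm, and all function and variable names are illustrative assumptions, not taken from the paper.

```python
import math
import random

def em_two_gaussian(data, iters=200):
    """Classical EM for a univariate two-component Gaussian mixture.
    Illustrative sketch: alternates E-step (posterior responsibilities)
    and M-step (weighted maximum likelihood updates)."""
    # crude initialization from the data spread (assumed heuristic)
    mu1, mu2 = min(data), max(data)
    s1 = s2 = (max(data) - min(data)) / 4 or 1.0
    w = 0.5  # mixing weight of component 1

    def pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: posterior probability that each point came from component 1
        r = []
        for x in data:
            a = w * pdf(x, mu1, s1)
            b = (1 - w) * pdf(x, mu2, s2)
            r.append(a / (a + b))
        # M-step: responsibility-weighted means, variances, and weight
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2) or 1e-6
        w = n1 / len(data)
    return w, (mu1, s1), (mu2, s2)

# synthetic two-component sample for demonstration
random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(5, 1) for _ in range(200)]
w, c1, c2 = em_two_gaussian(data)
```

Each iteration increases the observed-data likelihood, which is why EM converges to a stationary point; the paper replaces the likelihood with a dual divergence criterion and adds a proximal term while keeping this alternating structure.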
Al Mohamad, D., & Broniatowski, M. (2015). Generalized EM algorithms for minimum divergence estimation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9389, pp. 417–426). Springer Verlag. https://doi.org/10.1007/978-3-319-25040-3_45