EM versus Markov chain Monte Carlo for estimation of hidden Markov models: A computational perspective


Abstract

Hidden Markov models (HMMs) and related models have become standard in statistics during the last 15-20 years, with applications in diverse areas such as speech and other statistical signal processing, hydrology, financial statistics and econometrics, and bioinformatics. Inference in HMMs is traditionally carried out using the EM algorithm, but examples of Bayesian estimation, generally implemented through Markov chain Monte Carlo (MCMC) sampling, are also frequent in the HMM literature. The purpose of this paper is to compare the EM and MCMC approaches in three cases of differing complexity; the examples include model order selection, continuous-time HMMs, and variants of HMMs in which the observed data depend on many hidden variables in an overlapping fashion. All of these examples originate, in one way or another, from real-data applications. Neither EM nor MCMC analysis of HMMs is a black-box methodology free of user interaction, and we illustrate some of the problems, such as poor mixing and long computation times, that one may expect to encounter. © 2008 International Society for Bayesian Analysis.
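To make the EM side of the comparison concrete, the following is a minimal sketch (not the paper's own code) of one Baum-Welch iteration for a discrete-emission HMM: a scaled forward-backward pass for the E-step, followed by closed-form M-step re-estimates. All function and variable names here are illustrative.

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """Scaled forward-backward pass.
    obs: integer observation sequence; pi: initial distribution;
    A: K x K transition matrix; B: K x M emission matrix."""
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    c = np.zeros(T)                      # per-step scaling factors
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.ones((T, K))
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    return alpha, beta, c

def baum_welch_step(obs, pi, A, B):
    """One EM iteration; returns updated (pi, A, B) and the
    log-likelihood of the *input* parameters."""
    T, K = len(obs), len(pi)
    alpha, beta, c = forward_backward(obs, pi, A, B)
    gamma = alpha * beta                 # smoothed state probabilities
    xi = np.zeros((K, K))                # expected transition counts
    for t in range(T - 1):
        xi += np.outer(alpha[t], B[:, obs[t + 1]] * beta[t + 1]) * A / c[t + 1]
    new_pi = gamma[0] / gamma[0].sum()
    new_A = xi / xi.sum(axis=1, keepdims=True)
    new_B = np.zeros_like(B)
    for m in range(B.shape[1]):
        new_B[:, m] = gamma[obs == m].sum(axis=0)
    new_B /= new_B.sum(axis=1, keepdims=True)
    return new_pi, new_A, new_B, np.log(c).sum()
```

By EM's monotonicity, the log-likelihood returned on successive calls is non-decreasing. The MCMC counterpart discussed in the paper would instead alternate sampling the hidden state path (e.g., by forward filtering and backward sampling) with sampling parameters from their conditional posteriors.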

APA

Rydén, T. (2008). EM versus Markov chain Monte Carlo for estimation of hidden Markov models: A computational perspective. Bayesian Analysis, 3(4), 659–688. https://doi.org/10.1214/08-BA326
