A nonstationary hidden Markov model with approximately infinitely-long time-dependencies

Abstract

Hidden Markov models (HMMs) are a popular approach for modeling sequential data, typically based on the assumption of a first-order Markov chain; that is, only one-step-back dependencies are modeled, which is a rather unrealistic assumption in most applications. In this paper, we propose a method for postulating HMMs with approximately infinitely-long time-dependencies. Our approach considers the whole history of model states in the postulated dependencies, making use of a recently proposed nonparametric Bayesian method for modeling label sequences with infinitely-long time-dependencies, namely the sequence memoizer. Despite the entailed infinitely-long time-dependencies, we derive training and inference algorithms for our model with computational costs identical to those of simple first-order HMMs by employing a mean-field-like approximation. The efficacy of our proposed model is demonstrated experimentally.
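To make the modeling gap concrete: a first-order HMM factorizes the state sequence as p(s_{1:T}) = p(s_1) ∏_{t=2}^{T} p(s_t | s_{t-1}), whereas the model above places a sequence-memoizer prior over p(s_t | s_{1:t-1}), i.e., over transitions conditioned on the entire state history. The sketch below (not the authors' code; all names and toy values are illustrative) implements the standard scaled forward recursion for a first-order HMM with discrete emissions, whose O(TK²) per-sequence cost is the baseline that the proposed mean-field-like approximation matches.

```python
import numpy as np

def hmm_log_likelihood(pi, A, B, obs):
    """Scaled forward recursion for a first-order HMM (illustrative sketch).

    pi : (K,)   initial state distribution
    A  : (K, K) row-stochastic transitions, A[i, j] = p(s_t = j | s_{t-1} = i)
    B  : (K, M) discrete emissions, B[j, o] = p(x_t = o | s_t = j)
    obs: length-T sequence of observation indices

    Each step costs O(K^2), so a sequence costs O(T K^2) in total --
    the budget the paper's model also achieves despite conditioning
    on the full state history.
    """
    alpha = pi * B[:, obs[0]]            # unnormalized forward message at t = 1
    c = alpha.sum()                      # scaling factor, guards against underflow
    log_lik = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # one-step (first-order) propagation
        c = alpha.sum()
        log_lik += np.log(c)
        alpha /= c
    return log_lik

# Toy example: 2 states, 3 observation symbols (all values arbitrary).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(hmm_log_likelihood(pi, A, B, [0, 1, 2, 2, 0]))
```

Naively extending this recursion to full-history dependencies would blow up the effective state space; per the abstract, the paper's mean-field-like approximation is what keeps inference in this first-order-cost form.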

Citation (APA)

Chatzis, S. P., Kosmopoulos, D. I., & Papadourakis, G. M. (2014). A nonstationary hidden Markov model with approximately infinitely-long time-dependencies. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8888, pp. 51–62). Springer Verlag. https://doi.org/10.1007/978-3-319-14364-4_6
