Feature selection for hidden Markov models with discrete features

Abstract

Hidden Markov models (HMMs) are widely used for modeling multivariate time series data. However, not all collected data is useful for distinguishing between states. In these situations, feature selection should be implemented to save the expense of collecting and processing low-utility data. Feature selection for HMMs has been studied, but most existing methods assume that the observed data follows a Gaussian distribution. In this paper, a method for simultaneously estimating parameters and selecting features for an HMM with discrete observations is presented. The presented method is an extension of the feature saliency HMM, which was originally developed to incorporate feature cost into the feature selection process. Expectation-maximization algorithms are derived for features following a Poisson distribution and features following a discrete non-parametric distribution. The algorithms are evaluated on synthetic data sets and a real-world event detection data set that is composed of both Poisson and non-parametric features.
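To illustrate the feature saliency idea referenced in the abstract, the following is a minimal sketch (not the paper's actual derivation): each feature is assigned a saliency probability rho, and its observation likelihood is a mixture of a state-dependent distribution (the feature is relevant) and a state-independent "common" distribution (the feature is noise). The function names and the choice of two Poisson components here are illustrative assumptions; the paper also treats discrete non-parametric features and derives full EM updates.

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function P(X = k) with rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def saliency_likelihood(x, rho, lam_state, lam_common):
    """Feature-saliency observation likelihood for one count-valued feature.

    With probability rho the feature is relevant and follows the
    state-dependent Poisson(lam_state); with probability 1 - rho it is
    irrelevant and follows a state-independent Poisson(lam_common).
    The saliencies rho are estimated jointly with the HMM parameters,
    so features whose rho shrinks toward zero can be discarded.
    """
    return (rho * poisson_pmf(x, lam_state)
            + (1 - rho) * poisson_pmf(x, lam_common))
```

At the extremes the mixture collapses as expected: rho = 1 recovers the state-dependent Poisson likelihood, while rho = 0 recovers the state-independent one, which is what makes rho interpretable as a relevance score.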

Citation (APA)

Adams, S., & Beling, P. A. (2020). Feature selection for hidden Markov models with discrete features. In Advances in Intelligent Systems and Computing (Vol. 1037, pp. 67–82). Springer Verlag. https://doi.org/10.1007/978-3-030-29516-5_7
