Abstract
The hidden Markov model (HMM) has long been one of the most commonly used probabilistic graphical models for sequential or time series data. It has been applied in fields ranging from speech recognition and face recognition to anomaly detection and gene function prediction. In this paper, we propose a variant of the continuous HMM for modeling positive sequential data, which arise naturally in many real-life applications. In contrast with conventional HMMs, which typically use Gaussian distributions or Gaussian mixture models as the emission probability density, we adopt an inverted Dirichlet mixture model as the emission density of the HMM. This choice is motivated by the superior capability of inverted Dirichlet mixtures over Gaussian mixtures for modeling positive data, as reported in several recent studies. In addition, we develop a convergence-guaranteed approach to learning the proposed inverted Dirichlet-based HMM through variational Bayes inference. The effectiveness of the proposed HMM is validated on both synthetic data sets and a real-world application, anomaly-based network intrusion detection. Based on the experimental results, the proposed inverted Dirichlet-based HMM achieves detection accuracy rates that are about 4% to 9% higher than those obtained by the compared approaches.
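For reference, since the abstract centers on using an inverted Dirichlet mixture as the emission density, a standard parameterization of the inverted Dirichlet distribution for a D-dimensional positive vector is shown below. The notation is generic and is not taken from the paper itself; the authors' symbols and indexing may differ.

p(\mathbf{x} \mid \boldsymbol{\alpha}) = \frac{\Gamma\left(\sum_{d=1}^{D+1} \alpha_d\right)}{\prod_{d=1}^{D+1} \Gamma(\alpha_d)} \prod_{d=1}^{D} x_d^{\alpha_d - 1} \left(1 + \sum_{d=1}^{D} x_d\right)^{-\sum_{d=1}^{D+1} \alpha_d}, \qquad x_d > 0,

and the emission density of a hidden state j would then be a K-component mixture

p(\mathbf{x}_t \mid s_t = j) = \sum_{k=1}^{K} \pi_{jk}\, p(\mathbf{x}_t \mid \boldsymbol{\alpha}_{jk}),

where \pi_{jk} are the mixing weights of state j. The following is a minimal Python sketch of evaluating such a mixture emission log-density; it is an illustration of the formula above under the stated assumptions, not the authors' implementation, and the function names are hypothetical.

    import numpy as np
    from scipy.special import gammaln

    def log_inverted_dirichlet(x, alpha):
        # Log-density of the inverted Dirichlet distribution.
        # x: positive D-vector; alpha: (D+1)-vector of positive parameters.
        x = np.asarray(x, dtype=float)
        alpha = np.asarray(alpha, dtype=float)
        a_sum = alpha.sum()
        log_norm = gammaln(a_sum) - gammaln(alpha).sum()
        return (log_norm
                + np.sum((alpha[:-1] - 1.0) * np.log(x))
                - a_sum * np.log1p(x.sum()))

    def log_emission(x, weights, alphas):
        # Log emission density of one HMM state: a K-component
        # inverted Dirichlet mixture with mixing weights `weights`.
        comps = [np.log(w) + log_inverted_dirichlet(x, a)
                 for w, a in zip(weights, alphas)]
        return np.logaddexp.reduce(comps)

In a forward-backward or variational Bayes pass, such per-state emission log-densities would replace the Gaussian (mixture) emission terms of a conventional continuous HMM.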
Citation
Wang, R., & Fan, W. (2019). Positive Sequential Data Modeling Using Continuous Hidden Markov Models Based on Inverted Dirichlet Mixtures. IEEE Access, 7, 172341–172349. https://doi.org/10.1109/ACCESS.2019.2956477