Divergent predictive states: The statistical complexity dimension of stationary, ergodic hidden Markov processes

Abstract

Even simply defined, finite-state generators produce stochastic processes that require tracking an uncountable infinity of probabilistic features for optimal prediction. For processes generated by hidden Markov chains, the consequences are dramatic: their predictive models are generically infinite-state. Until recently, one could determine neither their intrinsic randomness nor their structural complexity. The prequel to this work introduced methods to accurately calculate the Shannon entropy rate (randomness) and to constructively determine their minimal (though infinite) set of predictive features. Leveraging this, we address the complementary challenge of determining how structured hidden Markov processes are by calculating their statistical complexity dimension - the information dimension of the minimal set of predictive features. This quantity tracks the divergence rate of the minimal memory resources required to optimally predict a broad class of truly complex processes.
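To make the Shannon entropy rate concrete, the sketch below estimates it for a small, well-known unifilar hidden Markov process, the Golden Mean Process (no two consecutive 1s). This is purely illustrative and is not the paper's algorithm: it compares the closed-form entropy rate for a unifilar generator against a naive block-entropy estimate from a simulated sequence. All function names here are hypothetical.

```python
import math
import random
from collections import Counter

# Golden Mean Process (illustrative example, not the paper's method):
# state A emits 0 w.p. 1/2 (stay in A) or 1 w.p. 1/2 (go to B);
# state B emits 0 w.p. 1 (return to A). Consecutive 1s never occur.

def closed_form_entropy_rate():
    # For a unifilar HMM: h = -sum_s pi(s) sum_x p(x|s) log2 p(x|s).
    # Stationary state distribution: pi(A) = 2/3, pi(B) = 1/3.
    pi_A, pi_B = 2 / 3, 1 / 3
    h_A = -(0.5 * math.log2(0.5) + 0.5 * math.log2(0.5))  # 1 bit in state A
    h_B = 0.0                                             # deterministic emission in B
    return pi_A * h_A + pi_B * h_B                        # = 2/3 bit per symbol

def simulate(n, seed=0):
    # Generate n output symbols from the Golden Mean Process.
    rng = random.Random(seed)
    state, out = "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append("0")
            else:
                out.append("1")
                state = "B"
        else:
            out.append("0")
            state = "A"
    return "".join(out)

def block_entropy(seq, L):
    # Shannon entropy (bits) of the empirical length-L block distribution.
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

seq = simulate(200_000)
# Conditional-entropy estimate h(L) = H(L+1) - H(L); the process's output
# is order-1 Markov, so small L already suffices here.
h_est = block_entropy(seq, 3) - block_entropy(seq, 2)
print(round(closed_form_entropy_rate(), 4))  # 0.6667
print(round(h_est, 3))
```

For generic non-unifilar hidden Markov processes no such closed form exists, which is precisely why the methods the abstract refers to are needed; the block-entropy estimate above also converges slowly for long-memory processes.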

Citation (APA)

Jurgens, A. M., & Crutchfield, J. P. (2021). Divergent predictive states: The statistical complexity dimension of stationary, ergodic hidden Markov processes. Chaos, 31(8). https://doi.org/10.1063/5.0050460
