Abstract
In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measures: information memory loss and information codependence structure. The former quantifies the memory content within a Markov process and determines its optimal order; the latter assesses the codependence among Markov processes. Both measures are evaluated on toy examples and applied to high-frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.
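As an illustrative sketch of the central quantity (not the paper's actual estimator), the snippet below simulates a first-order two-state Markov chain from a hypothetical transition matrix `P`, builds the sample estimate `P_hat` from observed transitions, and computes the state-frequency-weighted Kullback-Leibler divergence between the known matrix and its estimate. All numerical values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known 2-state transition matrix (hypothetical example values)
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Simulate a first-order Markov chain of length n
n = 50_000
states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# Sample estimate of the transition matrix from observed transition counts
counts = np.zeros((2, 2))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
P_hat = counts / counts.sum(axis=1, keepdims=True)

# KL divergence D(P_i || P_hat_i) per row, weighted by empirical state frequencies
weights = np.bincount(states[:-1], minlength=2) / (n - 1)
kl = sum(w * np.sum(p * np.log(p / q))
         for w, p, q in zip(weights, P, P_hat))
print(kl)
```

As the sample length grows, `P_hat` converges to `P` and the divergence shrinks toward zero, which is the regime in which the paper's approximate distribution of the divergence applies.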
Citation
Golub, A., Chliamovitch, G., Dupuis, A., & Chopard, B. (2015). Uncovering discrete non-linear dependence with information theory. Entropy, 17(5), 2606–2623. https://doi.org/10.3390/e17052606