Dynamical systems and computable information

Abstract

We present some new results relating information to chaotic dynamics. In our approach, the quantity of information is measured by the Algorithmic Information Content (Kolmogorov complexity) or by a computable version of it (Computable Information Content), in which the information is measured using a suitable universal data compression algorithm. We apply these notions to the study of dynamical systems by considering the asymptotic behavior of the quantity of information necessary to describe their orbits. When a system is ergodic, this method provides an indicator that equals the Kolmogorov-Sinai entropy almost everywhere. Moreover, if the entropy is zero, our method gives new indicators that measure the unpredictability of the system and allow various kinds of weak chaos to be classified. This, in fact, is the main motivation of this work. The behavior of a zero-entropy dynamical system is far from completely predictable, except in particular cases. Indeed, there are zero-entropy systems that exhibit a sort of weak chaos, in which the information necessary to describe the orbit behavior grows with time faster than logarithmically (the periodic case) but slower than linearly (the positive-entropy case). We also believe that this method is useful for classifying zero-entropy time series. To support this point of view, we show some theoretical and experimental results in specific cases.
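The following is a minimal illustrative sketch of the general idea described above: encode an orbit symbolically with respect to a partition, compress longer and longer prefixes of the symbolic string, and track how the compressed length grows with the orbit length. It uses zlib as a generic stand-in compressor and the logistic map with a binary partition purely as an example; the function names, the map, and the partition are assumptions for illustration and do not reproduce the specific compression algorithm or systems studied in the paper.

import zlib

def symbolic_orbit(x0, n, f, thresholds):
    # Iterate the map f from x0 for n steps, recording for each point the
    # index of the partition cell it falls in (cells defined by thresholds).
    symbols = []
    x = x0
    for _ in range(n):
        cell = sum(1 for t in thresholds if x >= t)
        symbols.append(str(cell))
        x = f(x)
    return "".join(symbols)

def information_content(s):
    # Rough upper bound (in bits) on the information content of the string s,
    # obtained from the length of its zlib-compressed encoding.
    return 8 * len(zlib.compress(s.encode(), 9))

if __name__ == "__main__":
    # Example: logistic map at r = 4 (positive entropy) with the binary
    # partition at 1/2; the per-symbol information should stay bounded
    # away from zero, unlike in a zero-entropy (weakly chaotic) system,
    # where the total information grows sublinearly.
    logistic = lambda x: 4.0 * x * (1.0 - x)
    for n in (1_000, 10_000, 100_000):
        s = symbolic_orbit(0.3, n, logistic, [0.5])
        print(n, information_content(s) / n)

In this toy setting the per-symbol compressed length approximates the entropy rate; for a weakly chaotic (zero-entropy) system one would instead examine how the total compressed length of the first n symbols grows with n (for example, logarithmically versus as a power of n), which is the kind of indicator the abstract refers to.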

Citation (APA)
Benci, V., Bonanno, C., Galatolo, S., Menconi, G., & Virgilio, M. (2004). Dynamical systems and computable information. Discrete and Continuous Dynamical Systems - Series B, 4(4), 935–960. https://doi.org/10.3934/dcdsb.2004.4.935
