Extensions to various information-theoretic quantities (such as entropy, redundancy, and mutual information) are discussed in the context of their role in nonlinear time series analysis. We also discuss "linearized" versions of these quantities and their use as benchmarks in tests for nonlinearity. Many of these quantities can be expressed in terms of the generalized correlation integral, and this expression permits us to exhibit more clearly the relationships of these quantities to each other and to other commonly used nonlinear statistics (such as the BDS and Green-Savit statistics). Further, numerical estimation of these quantities is found to be more accurate and more efficient when the correlation integral is employed in the computation. Finally, we consider several "local" versions of these quantities, including a local Kolmogorov-Sinai entropy, which gives an estimate of the variability of short-term predictability. © 1995 Elsevier Science B.V. All rights reserved.
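As a rough illustration of the correlation-integral approach mentioned in the abstract, the following is a minimal sketch (not the authors' implementation) of estimating a second-order mutual information from correlation sums, using the identity that the order-2 mutual information between two series can be written as the logarithm of the ratio of the joint correlation sum to the product of the marginal correlation sums. The function names, the choice of the maximum norm, and the fixed length scale `eps` are assumptions for this sketch.

```python
import numpy as np

def correlation_sum(points, eps):
    """Fraction of distinct pairs of points lying within eps of
    each other, using the maximum (Chebyshev) norm.

    points : array of shape (n, d)
    """
    n = len(points)
    # Pairwise Chebyshev distances via broadcasting.
    d = np.max(np.abs(points[:, None, :] - points[None, :, :]), axis=-1)
    # Keep only distinct pairs (strict upper triangle).
    iu = np.triu_indices(n, k=1)
    return np.mean(d[iu] < eps)

def mutual_information_order2(x, y, eps):
    """Order-2 mutual information estimate at scale eps:
    I2(X; Y) = ln[ C_XY(eps) / (C_X(eps) * C_Y(eps)) ],
    where C denotes a correlation sum."""
    c_x = correlation_sum(x[:, None], eps)
    c_y = correlation_sum(y[:, None], eps)
    c_xy = correlation_sum(np.column_stack([x, y]), eps)
    return np.log(c_xy / (c_x * c_y))
```

For identical series the joint and marginal correlation sums coincide, so the estimate reduces to -ln C_X(eps), which is positive whenever C_X(eps) < 1; for independent series the joint sum factorizes and the estimate is near zero. In practice one would examine the estimate over a range of `eps` rather than a single value.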