Generalized redundancies for time series analysis

  • Dean Prichard
  • James Theiler

Extensions to various information-theoretic quantities (such as entropy, redundancy, and mutual information) are discussed in the context of their role in nonlinear time series analysis. We also discuss "linearized" versions of these quantities and their use as benchmarks in tests for nonlinearity. Many of these quantities can be expressed in terms of the generalized correlation integral, and this expression permits us to more clearly exhibit the relationships of these quantities to each other and to other commonly used nonlinear statistics (such as the BDS and Green-Savit statistics). Further, numerical estimation of these quantities is found to be more accurate and more efficient when the correlation integral is employed in the computation. Finally, we consider several "local" versions of these quantities, including a local Kolmogorov-Sinai entropy, which gives an estimate of the variability of short-term predictability. © 1995 Elsevier Science B.V. All rights reserved.
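To make the correlation-integral connection concrete, the sketch below estimates an order-2 redundancy from correlation sums, in the Grassberger-Procaccia style the abstract alludes to. This is an illustrative reconstruction, not the authors' implementation: the function names, the max-norm metric, and the fixed length scale `eps` are choices made here for clarity. The idea is that the order-2 (Rényi) entropy can be approximated as H2 ≈ -log C(eps), so the redundancy R = Σ H2(x_i) - H2(x_1, …, x_m) reduces to a ratio of correlation sums and vanishes for independent series.

```python
import numpy as np

def correlation_sum(x, eps):
    """Order-2 correlation sum C(eps): the fraction of distinct point
    pairs whose max-norm distance falls below eps."""
    x = np.atleast_2d(np.asarray(x, dtype=float))
    if x.shape[0] == 1:          # a 1-D series arrives as a row; make it a column
        x = x.T
    # Pairwise max-norm distances via broadcasting: shape (n, n)
    d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    i, j = np.triu_indices(len(x), k=1)
    return np.mean(d[i, j] < eps)

def redundancy(series, eps):
    """Order-2 redundancy R = sum_i H2(x_i) - H2(joint), estimated with
    H2 ~ -log C(eps).  With the max norm, C(joint) factorizes into the
    product of marginal correlation sums for independent series, so R ~ 0
    in that case; dependence pushes R above zero."""
    joint = np.column_stack(series)
    log_marginals = sum(np.log(correlation_sum(s, eps)) for s in series)
    return np.log(correlation_sum(joint, eps)) - log_marginals
```

For example, `redundancy([x, x], eps)` for a single series paired with itself returns -log C(eps) > 0, while two independent noise series give a value near zero. In practice one would examine the estimate over a range of `eps` rather than a single scale; the paper's point is that this same correlation-sum machinery serves entropy, redundancy, and mutual information estimation at once.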

