We describe a class of deterministic, weakly chaotic dynamical systems with infinite memory. These "herding systems" combine learning and inference into one algorithm, converting moments or data items directly into an arbitrarily long sequence of pseudo-samples. This sequence has infinite-range correlations and as such is highly structured. We show that its information content, as measured by sub-extensive entropy, can grow as fast as K log T, which is faster than the usual (K/2) log T for exchangeable sequences generated by random posterior sampling from a Bayesian model. In one dimension we prove that herding sequences are equivalent to Sturmian sequences, which have complexity exactly log(T + 1). More generally, we advocate applying the rich theoretical framework around nonlinear dynamical systems, chaos theory, and fractal geometry to statistical learning. © 2010 IOP Publishing Ltd.
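The abstract does not spell out the herding dynamics, so the following is a minimal sketch assuming the standard greedy maximize-then-update rule from Welling's earlier herding work: at each step the state maximizing the inner product between the current weights and its feature vector is emitted, and the weights are incremented by the residual moment error. The function name `herd`, the symbols `phi`, `phi_bar`, `w`, and the toy state space and target moments are illustrative choices, not taken from the paper.

```python
import numpy as np

def herd(phi, phi_bar, T, w0=None):
    """Generate T pseudo-samples by deterministic herding.

    phi     : (S, K) feature matrix, one row phi(s) per discrete state s
    phi_bar : (K,)   target moments to be matched
    returns : list of emitted state indices and the weight trajectory
    """
    S, K = phi.shape
    w = np.zeros(K) if w0 is None else np.array(w0, dtype=float)
    samples, weights = [], []
    for _ in range(T):
        s = int(np.argmax(phi @ w))   # "inference": pick the state most favored by the weights
        w = w + phi_bar - phi[s]      # "learning": accumulate the moment error (infinite memory)
        samples.append(s)
        weights.append(w.copy())
    return samples, np.array(weights)

if __name__ == "__main__":
    # Toy example: 3 binary variables, features = the variables themselves,
    # target moments = desired marginals P(variable_k = 1).
    states = np.array([[int(b) for b in format(i, "03b")] for i in range(8)], dtype=float)
    target = np.array([0.7, 0.5, 0.2])
    samples, _ = herd(states, target, T=10000)
    print("empirical moments:", states[samples].mean(axis=0))  # approaches the target moments
```

The empirical feature averages of the emitted pseudo-samples are known to track the target moments at rate O(1/T) as long as the weights stay bounded. In one dimension (a single binary variable with target moment pi), the update reduces to w_t = w_{t-1} + pi - [w_{t-1} > 0], a rotation-like map of the interval whose output bits form the Sturmian sequence to which the abstract's equivalence result refers.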
CITATION STYLE
Welling, M., & Chen, Y. (2010). Statistical inference using weak chaos and infinite memory. In Journal of Physics: Conference Series (Vol. 233). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/233/1/012005