Monotone conditional complexity bounds on future prediction errors


Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume we are at a time t > 1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
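For concreteness, the two bounds contrasted in the abstract can be written schematically as follows. This is a sketch in the notation common to Hutter's sequence-prediction work; the instantaneous deviation s_t, the name Km for the new monotone conditional complexity, and the constants are our assumptions, not quoted from the paper.

\[
s_t \;=\; \sum_{a}\bigl(M(a \mid x_{<t}) - \mu(a \mid x_{<t})\bigr)^2
\]

Solomonoff's classical result bounds the total expected deviation from the start of the sequence by the complexity of the true distribution:

\[
\sum_{t=1}^{\infty} \mathbf{E}\,[s_t] \;\le\; (\ln 2)\, K(\mu) .
\]

The paper's new bound controls only the *future* deviation after observing x = x_1...x_t, schematically (up to additive/multiplicative constants):

\[
\sum_{t'=t+1}^{\infty} \mathbf{E}\,[\,s_{t'} \mid x\,]
\;\lesssim\;
\mathrm{Km}(\mu \mid x) \;+\; K\bigl(d(x \mid \mu)\bigr),
\]

where Km(μ | x) is the new conditional complexity, monotone in its condition (it can only decrease as x is prolonged), and d(x | μ) is the randomness deficiency of the observed prefix x with respect to μ.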

Citation (APA)

Chernov, A., & Hutter, M. (2005). Monotone conditional complexity bounds on future prediction errors. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3734 LNAI, pp. 414–428). https://doi.org/10.1007/11564089_32
