On the convergence speed of MDL predictions for Bernoulli sequences

Abstract

We consider the Minimum Description Length principle for online sequence prediction. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is bounded, implying convergence with probability one, and (b) it additionally specifies a rate of convergence. In general, only exponential loss bounds hold for MDL, as opposed to the linear bounds for a Bayes mixture. We show that this is the case even if the model class contains only Bernoulli distributions. We derive a new upper bound on the prediction error for countable Bernoulli classes, which implies a small bound (comparable to the one for Bayes mixtures) for certain important model classes. The results apply to many Machine Learning tasks, including classification and hypothesis testing. We provide arguments that our theorems generalize to countable classes of i.i.d. models. © Springer-Verlag Berlin Heidelberg 2004.
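To make the two predictors in the abstract concrete, the sketch below (not taken from the paper) simulates online prediction of a Bernoulli sequence over a small finite grid of parameters standing in for a countable class, with assumed uniform prior weights. At each step the MDL predictor uses the single model minimizing the two-part code length -log w(ν) - log ν(x_1:t), while the Bayes mixture averages the models' predictions under the posterior; both accumulate square loss against the true parameter as a rough proxy for the total expected square loss discussed above.

```python
import numpy as np

# Illustrative sketch only: MDL vs. Bayes-mixture prediction for a
# Bernoulli sequence over a small (hypothetical) model class.
rng = np.random.default_rng(0)

thetas = np.array([0.1, 0.25, 0.5, 0.75, 0.9])    # assumed finite stand-in for a countable class
weights = np.full(len(thetas), 1.0 / len(thetas)) # assumed uniform prior weights w(nu)

true_theta = 0.75
x = rng.random(200) < true_theta                  # observed Bernoulli(true_theta) sequence

log_lik = np.zeros(len(thetas))                   # log nu(x_1:t) for each model nu
sq_loss_mdl, sq_loss_bayes = 0.0, 0.0

for xt in x:
    # MDL: choose the model minimizing the code length -log w(nu) - log nu(x_1:t)
    code_len = -np.log(weights) - log_lik
    mdl_pred = thetas[np.argmin(code_len)]        # predicted P(next bit = 1)

    # Bayes mixture: posterior-weighted average of the models' predictions
    post = weights * np.exp(log_lik - log_lik.max())
    post /= post.sum()
    bayes_pred = float(post @ thetas)

    # accumulate square loss against the true parameter
    sq_loss_mdl += (mdl_pred - true_theta) ** 2
    sq_loss_bayes += (bayes_pred - true_theta) ** 2

    # update each model's log-likelihood with the new observation
    log_lik += np.log(np.where(xt, thetas, 1.0 - thetas))

print(f"total square loss  MDL: {sq_loss_mdl:.3f}  Bayes: {sq_loss_bayes:.3f}")
```

In such runs the Bayes mixture's cumulative loss typically stays small, whereas the MDL predictor can incur larger loss before it settles on the data-generating model, which is the gap between exponential and linear loss bounds that the paper quantifies.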

Citation (APA)
Poland, J., & Hutter, M. (2004). On the convergence speed of MDL predictions for Bernoulli sequences. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3244, pp. 294–308). Springer Verlag. https://doi.org/10.1007/978-3-540-30215-5_23
