Generalised entropy and asymptotic complexities of languages

Abstract

In this paper the concept of asymptotic complexity of languages is introduced. This concept formalises the notion of learnability in a particular environment and generalises Lutz and Fortnow's concepts of predictability and dimension. Asymptotic complexities in different prediction environments are then compared by describing the set of all pairs of asymptotic complexities with respect to two different environments. A geometric characterisation in terms of generalised entropies is obtained, thus generalising the results of Lutz and Fortnow. © Springer-Verlag Berlin Heidelberg 2007.
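
For orientation, in this line of work a generalised entropy is usually the infimum of expected loss in a prediction game. The LaTeX sketch below is an assumed illustration of that standard formulation, not taken from the abstract itself: the binary outcome space {0,1}, the prediction space \Gamma and the loss function \lambda are generic placeholders.

% Sketch (assumed formulation): generalised entropy of a binary prediction game
% with prediction space \Gamma and loss function \lambda.
\[
  H(p) \;=\; \inf_{\gamma \in \Gamma}
  \bigl( p\,\lambda(1,\gamma) + (1-p)\,\lambda(0,\gamma) \bigr),
  \qquad p \in [0,1].
\]
% For the log-loss game, \lambda(1,\gamma) = -\log\gamma and
% \lambda(0,\gamma) = -\log(1-\gamma), the infimum is attained at \gamma = p,
% so H(p) = -p\log p - (1-p)\log(1-p) (Shannon entropy).
% For the square-loss game, \lambda(\omega,\gamma) = (\omega-\gamma)^2,
% the same calculation gives H(p) = p(1-p).

Under this (assumed) reading, the curve p \mapsto H(p) provides the geometric object in terms of which the abstract's characterisation of pairs of asymptotic complexities is stated.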

Citation (APA)

Kalnishkan, Y., Vovk, V., & Vyugin, M. V. (2007). Generalised entropy and asymptotic complexities of languages. In Lecture Notes in Computer Science (Vol. 4539, pp. 293–307). Springer. https://doi.org/10.1007/978-3-540-72927-3_22
