The word entropy and how to compute it

Abstract

The complexity function of an infinite word counts the number of its factors of each length. For any positive function f, its exponential rate of growth is E_0(f) = lim inf_{n→∞} (1/n) log f(n). We define a new quantity, the word entropy E_W(f), as the maximal exponential growth rate of a complexity function smaller than f. It is in general smaller than E_0(f) and more difficult to compute; we give an algorithm to estimate it. The quantity E_W(f) is used to compute the Hausdorff dimension of the set of real numbers whose expansions in a given base have complexity bounded by f.
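
The growth-rate formula above can be illustrated numerically. The sketch below is not the paper's algorithm for E_W(f); it only approximates E_0(f) = lim inf_{n→∞} (1/n) log f(n) by taking the infimum of (1/n) log f(n) over a finite tail of indices. The function name, the example f, and the cutoffs tail_start and n_max are assumptions chosen for illustration.

```python
import math

def approx_exponential_growth_rate(f, tail_start=500, n_max=2000):
    """Rough numerical estimate of E_0(f) = liminf_{n->oo} (1/n) log f(n).

    Illustrative sketch only: a liminf is a limit over all n, so any finite
    truncation (tail_start, n_max are arbitrary here) can only suggest the
    value, not certify it.
    """
    # Infimum of (1/n) log f(n) over a late tail approximates the liminf.
    rates = (math.log(f(n)) / n for n in range(tail_start, n_max + 1))
    return min(rates)

if __name__ == "__main__":
    # Example: f(n) = 2^n * n^3 has exponential growth rate log 2.
    f = lambda n: (2 ** n) * n ** 3
    # Prints a value slightly above log 2 ≈ 0.693: the polynomial factor n^3
    # inflates finite-n estimates but does not affect the true limit.
    print(approx_exponential_growth_rate(f))
```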


Citation (APA)

Ferenczi, S., Mauduit, C., & Moreira, C. G. (2017). The word entropy and how to compute it. In Lecture Notes in Computer Science (Vol. 10432, pp. 157–163). Springer. https://doi.org/10.1007/978-3-319-66396-8_15
