Generalized entropies, density of states, and non-extensivity

Abstract

The concept of entropy connects the number of possible configurations with the number of variables in large stochastic systems. Independent or weakly interacting variables make the number of configurations scale exponentially with the number of variables, rendering the Boltzmann–Gibbs–Shannon entropy extensive. In systems with strongly interacting variables, or with variables driven by history-dependent dynamics, this is no longer true. Here we show that, contrary to the generally held belief, not only strong correlations or history-dependence but also a sufficiently skewed distribution of visiting probabilities, that is, first-order statistics, plays a role in determining the relation between configuration space size and system size, or, equivalently, the extensive form of generalized entropy. We present a macroscopic formalism describing the interplay between first-order statistics, higher-order statistics, and configuration space growth, and demonstrate that knowing any two of these strongly restricts the possibilities for the third. We believe that this unified macroscopic picture of the mechanisms constraining emergent degrees of freedom provides a step towards finding order in the zoo of strongly interacting complex systems.
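The link between configuration space growth and extensivity stated in the abstract can be illustrated with a minimal numerical sketch. The example below (an illustration of the general idea, not code from the article) compares two assumed growth laws: exponential growth W(N) = 2^N, for which the Boltzmann–Gibbs–Shannon entropy S = ln W grows linearly in N (extensive), and a hypothetical sub-exponential growth W(N) = N^2, for which S/N vanishes as N grows.

```python
import math

def bgs_entropy(num_configurations):
    """Boltzmann-Gibbs-Shannon entropy S = ln W of W equiprobable configurations."""
    return math.log(num_configurations)

# Exponential configuration growth: W(N) = 2**N for N independent
# binary variables. S/N = ln 2 for every N, so S is extensive.
for n in (10, 20, 40):
    print(n, bgs_entropy(2 ** n) / n)  # constant, equal to ln 2

# Assumed sub-exponential growth, W(N) = N**2, as a stand-in for a
# strongly constrained system: S = 2 ln N, so S/N -> 0 and the
# BGS entropy is no longer extensive, motivating generalized entropies.
for n in (10, 100, 1000):
    print(n, bgs_entropy(n ** 2) / n)  # shrinks towards zero
```

In the extensive case the entropy per variable converges to a finite constant; in the sub-exponential case it decays to zero, which is precisely the failure of extensivity that generalized entropies are designed to repair.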

Citation (APA)

Balogh, S. G., Palla, G., Pollner, P., & Czégel, D. (2020). Generalized entropies, density of states, and non-extensivity. Scientific Reports, 10(1). https://doi.org/10.1038/s41598-020-72422-8
