Rényi Entropy in Statistical Mechanics


Abstract

Rényi entropy was originally introduced in the field of information theory as a parametric relaxation of Shannon (in physics, Boltzmann–Gibbs) entropy. This has also fuelled various attempts to generalise statistical mechanics, although these mostly skip the physical arguments behind this entropy and instead introduce it artificially. However, as we show, no modification of statistical mechanics is needed to see how Rényi entropy arises automatically as the average rate of change of free energy over an ensemble at different temperatures. Moreover, this notion is extended by considering distributions for isospectral, non-isothermal processes, resulting in relative versions of free energy, in which the Kullback–Leibler divergence or the relative version of Rényi entropy appears within the structure of the corrections to free energy. These generalisations of free energy recover the ordinary thermodynamic potential whenever isothermal processes are considered.
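As a numerical illustration of the quantities named in the abstract (not taken from the paper itself), the following sketch computes the Rényi entropy and its relative (divergence) form for discrete distributions, using the standard definitions: H_α(p) = ln(Σ p_i^α)/(1−α) and D_α(p‖q) = ln(Σ p_i^α q_i^(1−α))/(α−1). Both reduce to their Shannon/Kullback–Leibler counterparts in the limit α → 1; the function names and the tolerance used for that limit are choices made here for clarity.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = ln(sum_i p_i^alpha) / (1 - alpha).

    For alpha -> 1 this reduces to Shannon (Boltzmann-Gibbs) entropy,
    handled here explicitly to avoid the 0/0 form.
    """
    if abs(alpha - 1.0) < 1e-12:
        # Shannon limit: -sum p_i ln p_i (terms with p_i = 0 contribute 0)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p||q) = ln(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1).

    For alpha -> 1 this reduces to the Kullback-Leibler divergence.
    """
    if abs(alpha - 1.0) < 1e-12:
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return math.log(sum(pi ** alpha * qi ** (1.0 - alpha)
                        for pi, qi in zip(p, q))) / (alpha - 1.0)

if __name__ == "__main__":
    p = [0.5, 0.3, 0.2]
    u = [1/3, 1/3, 1/3]  # uniform reference distribution
    for a in (0.5, 0.999, 1.0, 2.0):
        print(f"alpha={a}: H={renyi_entropy(p, a):.4f}, "
              f"D(p||u)={renyi_divergence(p, u, a):.4f}")
```

For a uniform distribution over n states, H_α = ln n for every α, mirroring how the generalised free energies in the paper collapse to the ordinary thermodynamic potential in the isothermal case.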

Citation (APA)
Fuentes, J., & Gonçalves, J. (2022). Rényi Entropy in Statistical Mechanics. Entropy, 24(8). https://doi.org/10.3390/e24081080
