Entropy, Shannon's measure of information and Boltzmann's H-theorem

51 citations · 60 Mendeley readers

Abstract

We start with a clear distinction between Shannon's Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
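The abstract's central point is that the SMI is defined on *any* probability distribution, via Shannon's standard formula H(p) = −Σᵢ pᵢ log pᵢ. A minimal sketch of that formula (the function name and validation are illustrative, not from the paper):

```python
import math

def smi(p, base=2):
    """Shannon's Measure of Information: -sum(p_i * log(p_i)).

    Accepts any probability distribution, illustrating the generality
    the abstract attributes to SMI (unlike thermodynamic entropy,
    which applies only to a special set of distributions).
    """
    if not math.isclose(sum(p), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i = 0 contribute nothing (lim p*log p -> 0)
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Uniform distribution over 4 outcomes: SMI = log2(4) = 2 bits
print(smi([0.25, 0.25, 0.25, 0.25]))
```

With base 2 the result is in bits; a certain outcome (`[1.0]`) gives an SMI of zero, the minimum uncertainty.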

Citation (APA)

Ben-Naim, A. (2017). Entropy, Shannon’s measure of information and Boltzmann’s H-theorem. Entropy, 19(2). https://doi.org/10.3390/e19020048
