Entropy and information of open systems

Abstract

Of the two definitions of information given by Shannon, one is identical to Boltzmann's entropy and is in fact a measure of statistical uncertainty. The other involves the difference between unconditional and conditional entropies and, if properly specified, allows one to introduce a measure of information for an open system that depends on the values of the system's control parameters. Two classes of systems are identified. For systems in the first class, an equilibrium state is possible and the law of conservation of information and entropy holds; in equilibrium, such systems have zero information and maximum entropy. In self-organization processes, information increases as the system moves away from the equilibrium state. For systems of the second class, no equilibrium state is possible. For these, a so-called 'chaoticity norm' is introduced, two kinds of self-organization processes are considered, and the concept of information is defined accordingly. These definitions of information are applied to classical and quantum physical systems as well as to medical and biological systems.
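
For orientation, a minimal sketch of the two Shannon-type quantities the abstract refers to, written in conventional notation (the symbols below are standard usage, not taken from the article itself): the first is the entropy of a probability distribution p_i, formally identical to Boltzmann's entropy; the second is the difference of unconditional and conditional entropies, i.e. the mutual information.

S = -k_B \sum_i p_i \ln p_i ,
I(X;Y) = H(Y) - H(Y \mid X) = \sum_{x,y} p(x,y) \, \ln \frac{p(x,y)}{p(x)\,p(y)} .

In this reading, the first expression measures statistical uncertainty, while the second measures how much the uncertainty about Y is reduced by knowledge of X; it vanishes when X and Y are independent.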

Citation (APA)

Klimontovich, Y. L. (1999). Entropy and information of open systems. Uspekhi Fizicheskikh Nauk, 169(4), 452. https://doi.org/10.3367/ufnr.0169.199904e.0443
