The nonextensive entropy of Tsallis can be seen as a consequence of postulates on the self-information, namely the constancy of the ratio of the first derivative of the self-information per unit probability to its curvature (second derivative). This constancy holds if we regard the probability distribution as the gradient of the self-information. By considering the form of the nth derivative of the self-information while keeping this ratio constant, we arrive at a general class of nonextensive entropies. Some properties of the series of entropies constructed in this way are investigated.
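As a minimal illustration (not taken from the paper itself), the standard Tsallis entropy $S_q = (1 - \sum_i p_i^q)/(q-1)$ and its $q \to 1$ limit, which recovers the Shannon entropy, can be sketched in Python:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis nonextensive entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    p: a probability distribution (nonnegative, summing to 1)
    q: the entropic index; q -> 1 recovers the Shannon entropy.
    """
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: the Shannon entropy -sum p_i ln p_i (natural log)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, 2.0))  # S_2 = 1 - sum p_i^2 = 0.62
print(tsallis_entropy(p, 1.0))  # Shannon entropy of p
# S_q varies continuously through q = 1:
print(abs(tsallis_entropy(p, 1.000001) - tsallis_entropy(p, 1.0)) < 1e-4)
```

Note that $S_q$ is nonextensive: for independent subsystems $A$ and $B$, $S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B)$, which reduces to ordinary additivity only at $q = 1$.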
Yamano, T. (2004). On a simple derivation of a family of nonextensive entropies from information content. Entropy, 6(4), 364–374. https://doi.org/10.3390/e6040364