Abstract
Many definitions of entropy are discussed in close connection with the Boltzmann-Gibbs exponential distribution and with measure theory. Tsallis introduced a definition that collapses to the other definitions as the parameter q approaches unity; in this definition, Lévy statistics are said to be used instead of the Boltzmann distribution. Working from the hypothesis "are we not wrong in being obsessed with the form of the entropy functional?", the aim was to assess whether the Tsallis definition can serve as an alternative basis for Shannon's information entropy. A mix of analogy, comparison, and both inductive and deductive reasoning was employed as the tools of enquiry. The enquiry touched many fields to which entropy is related, including quantum information theory. It concluded that entropy additivity is adopted axiomatically in most fields for good reasons, mainly emanating from measure theory, supporting the view that entropy has more to do with topology and volume growth than with statistics and distributions.
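To make the q → 1 limit and the additivity question concrete, here is a minimal numerical sketch (not from the paper itself; the distributions and the value q = 2 are illustrative choices). It uses the standard Tsallis form S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which tends to the Shannon entropy −Σᵢ pᵢ ln pᵢ as q → 1, and checks Tsallis's pseudo-additivity rule S_q(A,B) = S_q(A) + S_q(B) + (1 − q)·S_q(A)·S_q(B) for independent subsystems:

```python
import math

def shannon_entropy(p):
    # Shannon entropy in nats: H = -sum_i p_i * ln(p_i)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    # Tsallis entropy: S_q = (1 - sum_i p_i^q) / (q - 1), defined for q != 1
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]

# As q -> 1, S_q approaches the Shannon entropy of the same distribution.
for q in (2.0, 1.5, 1.1, 1.001):
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.4f}")
print(f"Shannon (q -> 1 limit): {shannon_entropy(p):.4f}")

# Non-additivity: for independent subsystems A and B,
# S_q(A,B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B),
# so S_q is additive only in the q -> 1 (Shannon) limit.
pa, pb, q = [0.5, 0.5], [0.7, 0.3], 2.0
joint = [a * b for a in pa for b in pb]
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(pa, q) + tsallis_entropy(pb, q)
       + (1 - q) * tsallis_entropy(pa, q) * tsallis_entropy(pb, q))
print(f"joint S_q = {lhs:.6f}, pseudo-additive sum = {rhs:.6f}")
```

The pseudo-additivity check is the numerical face of the abstract's point: ordinary entropy additivity is recovered only when q = 1, which is one reason most fields adopt additivity axiomatically.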
Sambasivam, S., & Bodas, V. (2006). Entropy: Form Follows Function. Issues in Informing Science and Information Technology, 3, 581–600. https://doi.org/10.28945/917