Some new measures of entropy, useful tools in biocomputing

Abstract

The basic problem rooted in the foundations of Information Theory (IT) (Shannon, Bell Syst Tech J 27:379-423 and 623-656, 1948; Volkenstein, Entropy and Information, Series: Progress in Mathematical Physics, 2009) is to reconstruct, as closely as possible, the input signal after observing the received output signal. The Shannon information measure is the only possible one in this context, but it must be clear that it is valid only within the restricted scope of the coding problems that C. E. Shannon himself considered in his lifetime (Shannon, Bell Syst Tech J 27:379-423 and 623-656, 1948). As Alfréd Rényi pointed out in his essential paper on generalized information measures (Rényi, Proc. of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, 547-561, 1961), for other sorts of problems other quantities may serve just as well, or even better, as measures of information. Such a quantity would be supported either by its operational significance, or by a set of natural postulates characterizing it, or preferably by both. Thus, the idea of generalized entropies arose in the scientific literature. We analyze here some new measures of entropy that are very useful in biocomputing applications (Ulanowicz and Hannon, Proc R Soc Lond B 232:181-192, 1987; Volkenstein, Entropy and Information, Series: Progress in Mathematical Physics, 2009). © 2010 Springer Science+Business Media, LLC.
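For reference, the generalized measure that Rényi introduced, and to which the abstract alludes, is standard textbook material (Rényi, 1961) and is not reproduced from the chapter itself. For a discrete probability distribution $P = (p_1, \dots, p_n)$, the Shannon entropy is

$$H(P) = -\sum_{i=1}^{n} p_i \log p_i,$$

and the Rényi entropy of order $\alpha$ (with $\alpha > 0$, $\alpha \neq 1$) is

$$H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha},$$

which recovers the Shannon entropy in the limit $\alpha \to 1$.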

Citation
Garrido, A. (2010). Some new measures of entropy, useful tools in biocomputing. In Advances in Experimental Medicine and Biology (Vol. 680, pp. 745–750). https://doi.org/10.1007/978-1-4419-5913-3_83
