Some bounds on entropy measures in information theory

Citations: 39 | Mendeley readers: 10

Abstract

In information theory, the fundamental tool is the entropy function, whose classical upper bound is obtained via Jensen's inequality. In this paper, we extend Jensen's inequality and apply it to derive useful lower bounds for various entropy measures of discrete random variables.
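For context, here is a minimal sketch of the classical upper bound the abstract alludes to, stated for a discrete random variable X taking n values with probabilities p_1, ..., p_n (this notation is assumed for illustration, not taken from the paper):

```latex
% Classical upper bound on Shannon entropy via Jensen's inequality.
% Since \log is concave, E[\log Y] \le \log E[Y] for any positive random variable Y.
% Here Y = 1/p_X(X), so E[Y] = \sum_i p_i (1/p_i) = n.
\[
H(X) \;=\; \sum_{i=1}^{n} p_i \log \frac{1}{p_i}
\;\le\; \log\!\left( \sum_{i=1}^{n} p_i \cdot \frac{1}{p_i} \right)
\;=\; \log n,
\]
% with equality if and only if p_i = 1/n for all i (the uniform distribution).
```

The paper's contribution runs in the opposite direction: an extension of Jensen's inequality that yields lower bounds on such entropy measures.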

Citation (APA)

Dragomir, S. S., & Goh, C. J. (1997). Some bounds on entropy measures in information theory. Applied Mathematics Letters, 10(3), 23–28. https://doi.org/10.1016/S0893-9659(97)00028-1
