In information theory, a fundamental tool is the entropy function, whose classical upper bound is obtained via Jensen's inequality. In this paper, we extend Jensen's inequality and apply it to derive several useful lower bounds for various entropy measures of discrete random variables.
Dragomir, S. S., & Goh, C. J. (1997). Some bounds on entropy measures in information theory. Applied Mathematics Letters, 10(3), 23–28. https://doi.org/10.1016/S0893-9659(97)00028-1
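The upper bound mentioned in the abstract is the classical result that the Shannon entropy of a distribution on n outcomes is at most log n, with equality for the uniform distribution; it follows by applying Jensen's inequality to the concave logarithm. A minimal sketch (not code from the paper; the function name and example distribution are illustrative):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log(p_i), in nats (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Jensen's inequality applied to the concave log gives H(p) <= log n,
# with equality exactly when p is uniform.
p = [0.5, 0.25, 0.125, 0.125]
n = len(p)
H = shannon_entropy(p)
assert H <= math.log(n) + 1e-12  # classical upper bound
print(f"H(p) = {H:.4f} nats, log n = {math.log(n):.4f} nats")
```

The cited paper works in the opposite direction, using an extension of Jensen's inequality to produce lower bounds on such entropy measures.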