Refinement of Jensen’s inequality and estimation of f- and Rényi divergence via Montgomery identity

20 citations · 5 readers on Mendeley

This article is free to access.

Abstract

Jensen’s inequality is important for obtaining inequalities for divergences between probability distributions. By applying a refinement of Jensen’s inequality (Horváth et al. in Math. Inequal. Appl. 14:777–791, 2011) and introducing a new functional based on an f-divergence functional, we obtain estimates for the new functionals, the f-divergence, and the Rényi divergence. Some inequalities for Rényi and Shannon estimates are constructed, and the Zipf–Mandelbrot law is used to illustrate the results. In addition, we generalize the refinement of Jensen’s inequality and derive new inequalities for the Rényi and Shannon entropies of m-convex functions using the Montgomery identity. It is also shown that maximizing the Shannon entropy yields a transition from the Zipf–Mandelbrot law to a hybrid Zipf–Mandelbrot law.
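To make the quantities in the abstract concrete, the following sketch computes the standard f-divergence, the Rényi divergence, and Zipf–Mandelbrot probabilities in pure Python. The parameter values (N, q, s) and function names are illustrative choices, not taken from the paper; the definitions used are the textbook ones: D_f(P‖Q) = Σ q_i f(p_i/q_i) and D_α(P‖Q) = (1/(α−1)) log Σ p_i^α q_i^{1−α}.

```python
import math

def zipf_mandelbrot(N, q, s):
    """Zipf–Mandelbrot law: p_i proportional to 1/(i+q)^s for ranks i=1..N."""
    weights = [1.0 / (i + q) ** s for i in range(1, N + 1)]
    total = sum(weights)  # generalized harmonic normalizer H_{N,q,s}
    return [w / total for w in weights]

def f_divergence(p, q_dist, f):
    """f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q_dist))

def renyi_divergence(p, q_dist, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q_dist))
    return math.log(s) / (alpha - 1.0)

# Two Zipf–Mandelbrot laws with (hypothetical) different parameters
p = zipf_mandelbrot(100, q=1.0, s=1.2)
q_dist = zipf_mandelbrot(100, q=3.0, s=1.5)

# Kullback–Leibler divergence as the f-divergence with f(t) = t*log(t)
kl = f_divergence(p, q_dist, lambda t: t * math.log(t))
d2 = renyi_divergence(p, q_dist, alpha=2.0)
```

Since the Rényi divergence is nondecreasing in its order and tends to the Kullback–Leibler divergence as α → 1, one expects kl ≤ d2 here, and both divergences vanish when the two distributions coincide.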

Citation (APA)

Khan, K. A., Niaz, T., Pečarić, Đ., & Pečarić, J. (2018). Refinement of Jensen’s inequality and estimation of f- and Rényi divergence via Montgomery identity. Journal of Inequalities and Applications, 2018. https://doi.org/10.1186/s13660-018-1902-9
