On Data-Processing and Majorization Inequalities for f-Divergences with Applications

Abstract

This paper is focused on the derivation of data-processing and majorization inequalities for f-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form the major part of this manuscript.
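The abstract's second application involves two standard objects that are easy to illustrate: an f-divergence, defined for a convex function f with f(1) = 0 by D_f(P||Q) = sum_x Q(x) f(P(x)/Q(x)), and the probability mass function induced by the leaves of a Tunstall parse tree. The sketch below is a minimal illustration, not code or a numerical setup from the paper; the function names tunstall_leaf_pmf and f_divergence, the binary source with P(0) = 0.8, and the dictionary size of 8 are illustrative choices. It grows a small Tunstall tree for a memoryless source and measures how far the leaf pmf is from the equiprobable distribution.

import heapq
import math


def tunstall_leaf_pmf(p, M):
    """Grow a Tunstall parse tree for a memoryless source with symbol
    probabilities p, stopping once a further split would exceed M leaves.
    Returns the pmf induced by the leaves (its values sum to 1)."""
    # One leaf per source symbol to start; the heap is keyed on -probability
    # so that heappop always returns the most probable leaf.
    heap = [(-q, (a,)) for a, q in enumerate(p)]
    heapq.heapify(heap)
    # Splitting a leaf into len(p) children adds len(p) - 1 leaves.
    while len(heap) + len(p) - 1 <= M:
        neg_q, word = heapq.heappop(heap)
        for a, pa in enumerate(p):
            heapq.heappush(heap, (neg_q * pa, word + (a,)))
    return {word: -neg_q for neg_q, word in heap}


def f_divergence(P, Q, f):
    """D_f(P||Q) = sum_x Q(x) f(P(x)/Q(x)) for convex f with f(1) = 0,
    assuming strictly positive probabilities."""
    return sum(q * f(pi / q) for pi, q in zip(P, Q))


# Illustrative (made-up) setup: binary source with P(0) = 0.8, P(1) = 0.2,
# and a Tunstall dictionary of at most 8 leaves.
p = [0.8, 0.2]
leaves = tunstall_leaf_pmf(p, 8)
leaf_pmf = list(leaves.values())
uniform = [1.0 / len(leaf_pmf)] * len(leaf_pmf)

# Two f-divergences between the leaf pmf and the equiprobable distribution:
# relative entropy, f(t) = t log t, and total variation, f(t) = |t - 1| / 2.
kl = f_divergence(leaf_pmf, uniform, lambda t: t * math.log(t))
tv = f_divergence(leaf_pmf, uniform, lambda t: 0.5 * abs(t - 1.0))
print(f"{len(leaf_pmf)} leaves, D(leaf||uniform) = {kl:.4f} nats, TV = {tv:.4f}")

For a dictionary of M leaves, the relative entropy computed here equals log M minus the entropy of the leaf pmf, so a small value indicates that the leaf pmf is nearly equiprobable and the Tunstall code operates close to the source entropy. The paper quantifies this kind of approximation through majorization and f-divergence bounds rather than such a direct numerical check.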

Citation (APA)

Sason, I. (2019). On Data-Processing and Majorization Inequalities for f-Divergences with Applications. Entropy, 21(10). https://doi.org/10.3390/e21101022
