We derive inequalities of the form Δ(P, Q) ≤ H(P|R) + H(Q|R) which hold for every choice of probability measures P, Q, R, where H(P|R) denotes the relative entropy of P with respect to R and Δ(P, Q) stands for a coupling-type "distance" between P and Q. Using the chain rule for relative entropies and then specializing to Q with a given support, we recover some of Talagrand's concentration of measure inequalities for product spaces.
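As a rough illustration (a sketch relying on the standard definitions, not the paper's exact statements), the quantities involved are the relative entropy

    H(P|R) = \int \log \frac{dP}{dR} \, dP \quad \text{if } P \ll R, \qquad H(P|R) = +\infty \text{ otherwise},

and, on a product space X_1 \times X_2, the chain rule decomposing it coordinate-wise,

    H(P|R) = H(P_1|R_1) + \int H\big( P(\cdot \mid x_1) \,\big|\, R(\cdot \mid x_1) \big) \, dP_1(x_1).

Taking Q to be the normalized restriction of R to a set A (so that dQ/dR = \mathbf{1}_A / R(A)) gives H(Q|R) = \log \frac{1}{R(A)}, which is presumably the sense in which specializing to Q with a given support turns an inequality Δ(P, Q) ≤ H(P|R) + H(Q|R) into a concentration bound involving R(A); the specific coupling distance Δ and the resulting constants are as in the paper and are not reproduced here.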
Dembo, A. (1997). Information inequalities and concentration of measure. Annals of Probability, 25(2), 927–939. https://doi.org/10.1214/aop/1024404424