Abstract
We derive inequalities of the form Δ(P, Q) ≤ H(P|R) + H(Q|R), which hold for every choice of probability measures P, Q, R, where H(P|R) denotes the relative entropy of P with respect to R and Δ(P, Q) stands for a coupling-type "distance" between P and Q. Using the chain rule for relative entropies, and then specializing to Q with a given support, we recover some of Talagrand's concentration of measure inequalities for product spaces.
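For orientation, a minimal LaTeX sketch of the objects named in the abstract, assuming the standard definition of relative entropy and the usual two-coordinate chain rule; the specific coupling-type distance Δ(P, Q) is defined in the paper and is only indicated schematically here.

\[
  H(P \mid R) \;=\;
  \begin{cases}
    \displaystyle\int \log\frac{dP}{dR}\, dP, & P \ll R,\\[4pt]
    +\infty, & \text{otherwise},
  \end{cases}
  \qquad
  \Delta(P, Q) \;\le\; H(P \mid R) + H(Q \mid R).
\]

On a product space, writing $P_1$ for the first marginal and $P_{2\mid x_1}$ for the conditional law of the second coordinate given $x_1$ (and similarly for $R$), the chain rule reads
\[
  H(P \mid R) \;=\; H(P_1 \mid R_1) \;+\; \int H\bigl(P_{2\mid x_1} \,\big|\, R_{2\mid x_1}\bigr)\, dP_1(x_1),
\]
which is the identity the abstract invokes before specializing Q to measures with a prescribed support.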
Dembo, A. (1997). Information inequalities and concentration of measure. Annals of Probability, 25(2), 927–939. https://doi.org/10.1214/aop/1024404424