Abstract
There is a simple inequality by Pinsker between variational distance and informational divergence of probability measures defined on arbitrary probability spaces. We shall consider probability measures on sequences taken from countable alphabets, and derive, from Pinsker's inequality, bounds on the d̄-distance by informational divergence. Such bounds can be used to prove the "concentration of measure" phenomenon for some non-product distributions.
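For orientation, the two inequalities involved can be sketched in standard notation; the first is Pinsker's inequality as usually stated, while the second is only a schematic form with an unspecified constant C, since the precise statement and hypotheses are those of the paper itself. For probability measures $P$, $Q$ on a common space, with $D(P\|Q)$ the informational divergence (relative entropy, natural logarithm) and $\|P-Q\| = \sup_A |P(A)-Q(A)|$ the variational distance, Pinsker's inequality reads

\[ \|P - Q\| \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)}. \]

For measures $q$, $p$ on $X^n$, with $X$ a countable alphabet and $\bar d$ the transportation distance induced by the normalized Hamming metric, bounds of the kind the abstract describes take the schematic shape

\[ \bar d(q, p) \;\le\; C \sqrt{\tfrac{1}{n}\, D(q \,\|\, p)}, \]

and it is bounds of this shape that, via the transportation method, yield the concentration-of-measure estimates mentioned above.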
Citation
Marton, K. (1996). Bounding d̄-distance by informational divergence: A method to prove measure concentration. Annals of Probability, 24(2), 857–866. https://doi.org/10.1214/aop/1039639365