Composable bounds on information flow from distribution differences

Abstract

We define information leakage in terms of a “difference” between the a priori distribution over some remote behavior and the a posteriori distribution of that behavior conditioned on a local observation from a protocol run. Either a maximum or an average over observations may be used. We identify a set of notions of “difference” and show that they reduce our general leakage notion to various definitions in the literature. We also prove general composability theorems analogous to the data-processing inequality for mutual information, or to cascading channels for channel capacities.
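
To make the definition concrete, the sketch below computes leakage as the “difference” between a prior over a secret and the Bayes posterior after each observation, taking either the maximum or the probability-weighted average over observations. Total variation distance is used here purely as one possible instantiation of “difference”; it is an assumption for illustration, not necessarily the notion the paper adopts, and the names prior, channel, and leakage are hypothetical.

```python
def total_variation(p, q):
    """Total variation distance between two distributions given as dicts."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

def leakage(prior, channel, mode="max"):
    """
    prior:   dict mapping secret value x -> P(X = x)
    channel: dict mapping (x, y) -> P(Y = y | X = x)
    mode:    "max" for worst-case over observations y, "avg" for the expectation over y
    """
    ys = {y for (_, y) in channel}
    # Marginal probability of each observation y.
    p_y = {y: sum(prior[x] * channel.get((x, y), 0.0) for x in prior) for y in ys}
    diffs, weights = [], []
    for y in ys:
        if p_y[y] == 0.0:
            continue
        # Posterior P(X = x | Y = y) by Bayes' rule.
        posterior = {x: prior[x] * channel.get((x, y), 0.0) / p_y[y] for x in prior}
        diffs.append(total_variation(prior, posterior))
        weights.append(p_y[y])
    if mode == "max":
        return max(diffs)
    return sum(w * d for w, d in zip(weights, diffs))

# Example: a one-bit secret observed through a noisy channel.
prior = {0: 0.5, 1: 0.5}
channel = {(0, "a"): 0.9, (0, "b"): 0.1, (1, "a"): 0.1, (1, "b"): 0.9}
print(leakage(prior, channel, mode="max"))  # worst-case leakage
print(leakage(prior, channel, mode="avg"))  # average-case leakage
```

Swapping total_variation for another notion of “difference” changes which definition from the literature the quantity reduces to, which is the point of the paper's general framework.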

Citation (APA)

Ando, M., & Guttman, J. D. (2016). Composable bounds on information flow from distribution differences. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9481, pp. 13–29). Springer Verlag. https://doi.org/10.1007/978-3-319-29883-2_2
