Hypothesis testing is the fundamental theory behind decision-making and therefore plays a critical role in information systems. A prominent example is machine learning, which is being developed and applied across a wide range of applications. Beyond its legitimate uses, however, hypothesis testing can also be exploited for illegitimate purposes, namely to infer private information about individuals. The continued advancement of hypothesis testing techniques therefore increases privacy leakage risks. Accordingly, research on privacy-by-design techniques that enhance privacy against adversarial hypothesis testing has received increasing attention in recent years. In this chapter, the problem of privacy against adversarial hypothesis testing is formulated in the presence of a distortion source. Information-theoretic fundamental bounds on the optimal privacy performance and the corresponding privacy-enhancing technologies are first discussed under the assumption of independent and identically distributed adversarial observations. The discussion is then extended to a privacy problem model with memory. Finally, applications of the theoretical results and privacy-enhancing technologies to the smart meter privacy problem are illustrated.
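To make the adversarial setting concrete, the following is a minimal illustrative sketch (not taken from the chapter) of an adversary running a binary hypothesis test on i.i.d. observations: under hypothesis H0 the observations are Bernoulli(0.3), under H1 they are Bernoulli(0.6), and the adversary applies a log-likelihood ratio test. The distributions and threshold are assumed for illustration only.

```python
import math

def log_likelihood_ratio(samples, p0=0.3, p1=0.6):
    """Sum of per-sample log(P1(x)/P0(x)) for i.i.d. Bernoulli observations.

    p0, p1: success probabilities under H0 and H1 (illustrative values).
    """
    llr = 0.0
    for x in samples:
        if x == 1:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
    return llr

def adversary_decides_h1(samples, threshold=0.0):
    # The adversary declares H1 when the accumulated evidence
    # (log-likelihood ratio) exceeds the decision threshold.
    return log_likelihood_ratio(samples) > threshold

# A run of mostly ones strongly favors H1; mostly zeros favors H0.
print(adversary_decides_h1([1, 1, 1, 0, 1, 1]))
print(adversary_decides_h1([0, 0, 0, 1, 0, 0]))
```

As more observations accumulate, the adversary's error probability decays, which is what privacy-enhancing mechanisms (e.g., injecting distortion from an auxiliary source, as in the chapter's setting) aim to slow down or prevent.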
Citation:
Li, Z., You, Y., & Oechtering, T. J. (2019). Privacy against adversarial hypothesis testing: Theory and application to smart meter privacy problem. In Privacy in Dynamical Systems (pp. 43–64). Springer Singapore. https://doi.org/10.1007/978-981-15-0493-8_3