Differential Privacy for Statistical Data of Educational Institutions

Abstract

Electronic methods of managing the educational process are gaining popularity, and a large number of user-facing programs for such record keeping have appeared recently. As a result, the protection of personal data requires increased attention. The coronavirus pandemic has significantly increased the amount of data distributed remotely, which requires continuous information security for a wider range of workers. In this article, we consider differential privacy, a relatively new mechanism designed to help protect personal data. Differential privacy provides a mathematically rigorous definition of the risk incurred by publishing statistics derived from sensitive data. By bounding the probability of possible data leakage, one can build an appropriate policy for adding "noise" to publicly available statistics. This approach makes it possible to strike a compromise between preserving general patterns in the data and protecting the personal data of participants in the educational process.
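To make the idea concrete, here is a minimal sketch of the standard Laplace mechanism, the most common way to "noise" a published count. This is an illustration of the general technique, not the paper's own implementation; the function name `dp_count` and the example scenario are assumptions for this sketch. A count query has sensitivity 1 (adding or removing one student's record changes it by at most 1), so adding Laplace noise with scale 1/ε yields an ε-differentially private release.

```python
import math
import random

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count statistic with epsilon-differential privacy.

    Adds Laplace(0, sensitivity/epsilon) noise. A smaller epsilon means
    more noise and stronger privacy; a larger epsilon means a more
    accurate but less private release.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) noise via inverse-CDF sampling
    # from a uniform variate on (-0.5, 0.5).
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical example: publish how many students failed an exam
# without exposing any individual record.
noisy_failures = dp_count(true_count=42, epsilon=0.5)
```

Because the noise has zero mean, aggregate patterns survive: averaged over many releases, the noisy statistic concentrates around the true value, while any single release reveals little about whether a particular student's record is present.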

Citation (APA)

Podsevalov, I., Podsevalov, A., & Korkhov, V. (2022). Differential Privacy for Statistical Data of Educational Institutions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13380 LNCS, pp. 603–615). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-10542-5_41
