Homomorphic Encryption-Based Federated Privacy Preservation for Deep Active Learning

Abstract

Active learning is a technique for maximizing the performance of machine learning with minimal labeling effort by letting the machine automatically and adaptively select the most informative data for labeling. Since the labels on records may contain sensitive information, privacy-preserving mechanisms should be integrated into active learning. We propose a privacy-preservation scheme for active learning using homomorphic encryption-based federated learning. Federated learning distributes computation across multiple clients, and homomorphic encryption enhances the privacy of user data with a strong security level. The experimental results show that the proposed homomorphic encryption-based federated learning scheme can preserve privacy in active learning while maintaining model accuracy. Furthermore, we provide a Deep Leakage from Gradients (DLG) comparison: the proposed scheme shows no gradient leakage, whereas the related schemes show more than 74% gradient leakage.
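
To make the idea concrete, below is a minimal, illustrative sketch rather than the authors' implementation (the paper may use a different homomorphic scheme or protocol). It uses the third-party python-paillier (phe) library for additively homomorphic encryption of client model updates, plus a simple entropy-based query rule as a stand-in for the active-learning selection step; all function and variable names are hypothetical.

# Illustrative sketch (not the authors' code): additively homomorphic
# aggregation of client model updates using python-paillier, plus an
# entropy-based query-selection step typical of deep active learning.
# Install: pip install phe numpy
import numpy as np
from phe import paillier

# Key setup: in a real deployment the private key stays with the clients
# (or a trusted key holder), never with the aggregation server.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

def encrypt_update(update, pub):
    """Encrypt each coordinate of a client's model update (hypothetical helper)."""
    return [pub.encrypt(float(x)) for x in update]

def aggregate_encrypted(encrypted_updates):
    """Server-side sum of ciphertexts; the server never sees plaintext gradients."""
    total = encrypted_updates[0]
    for enc in encrypted_updates[1:]:
        total = [a + b for a, b in zip(total, enc)]
    return total

def decrypt_average(encrypted_sum, priv, num_clients):
    """Decrypt the aggregate and average it (done by the key holder, not the server)."""
    return np.array([priv.decrypt(c) for c in encrypted_sum]) / num_clients

def entropy_query(probabilities, k):
    """Select the k most uncertain (highest predictive entropy) samples to label."""
    eps = 1e-12
    ent = -np.sum(probabilities * np.log(probabilities + eps), axis=1)
    return np.argsort(ent)[::-1][:k]

if __name__ == "__main__":
    # Toy local updates from three clients (stand-ins for real gradients).
    client_updates = [np.array([0.10, -0.20, 0.05]),
                      np.array([0.12, -0.18, 0.07]),
                      np.array([0.08, -0.22, 0.04])]
    encrypted = [encrypt_update(u, public_key) for u in client_updates]
    aggregate = aggregate_encrypted(encrypted)
    print("averaged update:", decrypt_average(aggregate, private_key, len(client_updates)))

    # Toy softmax outputs on an unlabeled pool; query the 2 most uncertain samples.
    probs = np.array([[0.95, 0.05], [0.55, 0.45], [0.50, 0.50], [0.80, 0.20]])
    print("query indices:", entropy_query(probs, k=2))

Because Paillier ciphertexts can be added without decryption, the aggregation server in this sketch only ever handles encrypted updates, which is what blocks gradient-leakage attacks such as DLG.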

Citation (APA)

Kurniawan, H., & Mambo, M. (2022). Homomorphic Encryption-Based Federated Privacy Preservation for Deep Active Learning. Entropy, 24(11), 1545. https://doi.org/10.3390/e24111545
