Intrusion Detection Based on Privacy-Preserving Federated Learning for the Industrial IoT

Abstract

Federated learning (FL) has attracted significant interest given its prominent advantages and applicability in many scenarios. However, it has been demonstrated that sharing updated gradients/weights during the training process can lead to privacy concerns. In the context of the Internet of Things (IoT), this can be exacerbated due to intrusion detection systems (IDSs), which are intended to detect security attacks by analyzing the devices' network traffic. Our work provides a comprehensive evaluation of differential privacy techniques, which are applied during the training of an FL-enabled IDS for the industrial IoT. Unlike previous approaches, we deal with non-independent and identically distributed (non-IID) data over the recent ToN_IoT dataset, and compare the accuracy obtained under different privacy requirements and aggregation functions, namely FedAvg and the recently proposed Fed+. According to our evaluation, the use of Fed+ in our setting provides similar results even when noise is included in the federated training process.
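To make the general idea concrete, the following is a minimal sketch (not the authors' implementation) of how differential-privacy noise can be combined with FedAvg-style aggregation: each client's model update is norm-clipped and perturbed with Gaussian noise before the server averages the updates. Function names, the clipping bound, and the noise scale are illustrative assumptions, not values taken from the paper.

```python
# Sketch only: DP-style noising of client updates followed by FedAvg aggregation.
# Model weights are represented as flat NumPy vectors for simplicity.
import numpy as np

def dp_fedavg(global_weights, client_weights, clip_norm=1.0, noise_std=0.1, rng=None):
    """Aggregate client weights FedAvg-style, clipping and noising each client's update."""
    rng = rng or np.random.default_rng(0)
    noised_updates = []
    for w in client_weights:
        update = w - global_weights                              # client's model delta
        norm = np.linalg.norm(update)
        update = update * min(1.0, clip_norm / (norm + 1e-12))   # bound the update's sensitivity
        update = update + rng.normal(0.0, noise_std * clip_norm, size=update.shape)
        noised_updates.append(update)
    # FedAvg: the server applies the mean of the (noised) client updates.
    return global_weights + np.mean(noised_updates, axis=0)

# Toy usage: three simulated clients holding different (non-IID-like) perturbations
# of the same global model.
global_w = np.zeros(5)
clients = [global_w + np.random.default_rng(i).normal(0.0, 0.5, 5) for i in range(3)]
new_global = dp_fedavg(global_w, clients)
print(new_global)
```

Fed+, evaluated alongside FedAvg in the paper, relaxes the requirement that all clients converge to a single shared model; the sketch above only illustrates the plain averaging-plus-noise baseline.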

Citation (APA)

Ruzafa-Alcazar, P., Fernandez-Saura, P., Marmol-Campos, E., Gonzalez-Vidal, A., Hernandez-Ramos, J. L., Bernal-Bernabe, J., & Skarmeta, A. F. (2023). Intrusion Detection Based on Privacy-Preserving Federated Learning for the Industrial IoT. IEEE Transactions on Industrial Informatics, 19(2), 1145–1154. https://doi.org/10.1109/TII.2021.3126728
