FHDnn: Communication Efficient and Robust Federated Learning for AIoT Networks


Abstract

The advent of IoT and advances in edge computing have inspired federated learning, a distributed algorithm that enables on-device learning. Transmission costs, unreliable networks, and limited compute power, all typical characteristics of IoT networks, pose a severe bottleneck for federated learning. In this work we propose FHDnn, a synergetic federated learning framework that combines the salient aspects of CNNs and hyperdimensional computing. FHDnn performs hyperdimensional learning on features extracted by a self-supervised contrastive learning framework, which accelerates training, lowers communication costs, and increases robustness to network errors by avoiding transmission of the CNN and training only the hyperdimensional component. Compared to CNNs, we show through experiments that FHDnn reduces communication costs by 66X and local client compute and energy consumption by 1.5-6X, while remaining highly robust to network errors with minimal loss in accuracy.
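The abstract describes training only a lightweight hyperdimensional classifier on top of frozen CNN features, so that clients never transmit CNN weights. The sketch below illustrates that general idea, not the authors' implementation: feature and hypervector dimensions, function names, and the aggregation rule are assumptions chosen for illustration.

```python
# Minimal sketch (assumptions, not the paper's code): hyperdimensional (HD)
# classification on top of frozen CNN features in a federated setting.
import numpy as np

FEATURE_DIM = 512   # assumed output size of the frozen, contrastively pre-trained CNN
HD_DIM = 10_000     # assumed hypervector dimensionality
NUM_CLASSES = 10

rng = np.random.default_rng(0)
# Random projection shared by all clients; only HD state is ever transmitted.
projection = rng.standard_normal((FEATURE_DIM, HD_DIM))

def encode(features: np.ndarray) -> np.ndarray:
    """Encode CNN feature vectors (N, FEATURE_DIM) into bipolar hypervectors (N, HD_DIM)."""
    return np.sign(features @ projection)

def local_train(features: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """One client's HD training step: bundle (sum) encoded samples per class."""
    class_hvs = np.zeros((NUM_CLASSES, HD_DIM))
    for hv, y in zip(encode(features), labels):
        class_hvs[y] += hv
    return class_hvs

def aggregate(client_models: list) -> np.ndarray:
    """Server-side aggregation: element-wise sum of the clients' class hypervectors."""
    return np.sum(client_models, axis=0)

def predict(global_model: np.ndarray, features: np.ndarray) -> np.ndarray:
    """Classify by cosine similarity between encoded features and class hypervectors."""
    hvs = encode(features)
    sims = hvs @ global_model.T
    sims /= (np.linalg.norm(hvs, axis=1, keepdims=True)
             * np.linalg.norm(global_model, axis=1) + 1e-9)
    return sims.argmax(axis=1)

# Example round with two simulated clients (random stand-ins for CNN features).
clients = [(rng.standard_normal((32, FEATURE_DIM)), rng.integers(0, NUM_CLASSES, 32))
           for _ in range(2)]
global_model = aggregate([local_train(x, y) for x, y in clients])
print(predict(global_model, clients[0][0])[:5])
```

Because each client only uploads its class hypervectors (a NUM_CLASSES x HD_DIM array) rather than CNN weights, the per-round payload is small and simple element-wise aggregation tolerates occasional corrupted or dropped updates, which is consistent with the communication and robustness claims above.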

Citation (APA)

Chandrasekaran, R., Ergun, K., Lee, J., Nanjunda, D., Kang, J., & Rosing, T. (2022). FHDnn: Communication Efficient and Robust Federated Learning for AIoT Networks. In Proceedings - Design Automation Conference (pp. 37–42). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1145/3489517.3530394
