FedPeer: A Peer-to-Peer Learning Framework Using Federated Learning


Abstract

The computing power of personal devices has increased in the recent past, and large amounts of data are being generated by these devices. Owing to security concerns, current research focuses on training machine learning models locally on these devices. To train a global model, two different approaches have been proposed: one in which the models trained on the devices are aggregated at a central location and the aggregated model is transmitted back to the devices, and another in which devices exchange models among themselves without the need for a central server. Both approaches have their own advantages and shortcomings. In this paper, we propose FedPeer, a new decentralized machine learning approach in which nodes are clustered based on their confidence in predicting the data. The high-confidence nodes form a federated cluster, while the remaining nodes participate in peer-to-peer decentralized learning along with a representative from the federated cluster. Our experiments show that this approach achieves a faster convergence rate and lower communication overhead than either a purely federated approach or a fully peer-to-peer method.
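The abstract's hybrid scheme can be sketched in a few lines: split nodes by prediction confidence, aggregate the high-confidence cluster FedAvg-style, and let its averaged model join a gossip round with the remaining peers. This is a minimal illustrative sketch, not the paper's implementation; the confidence threshold, the flat-list model representation, and the helper names (`partition_by_confidence`, `federated_average`, `gossip_round`) are all assumptions introduced here for illustration.

```python
def partition_by_confidence(confidences, threshold):
    # Assumed split rule: nodes at or above the threshold form the
    # federated cluster; the rest learn peer-to-peer. Returns node ids.
    federated = [i for i, c in enumerate(confidences) if c >= threshold]
    p2p = [i for i, c in enumerate(confidences) if c < threshold]
    return federated, p2p


def federated_average(models):
    # FedAvg-style coordinate-wise mean of equal-length parameter vectors.
    n = len(models)
    dim = len(models[0])
    return [sum(m[k] for m in models) / n for k in range(dim)]


def gossip_round(models, pairs):
    # One synchronous peer-to-peer step: each paired pair of nodes
    # replaces its parameters with their average. Models are copied so
    # the input list is left unchanged.
    models = [m[:] for m in models]
    for i, j in pairs:
        avg = [(a + b) / 2.0 for a, b in zip(models[i], models[j])]
        models[i], models[j] = avg, avg[:]
    return models


def fedpeer_round(models, confidences, threshold, pairs):
    # Hypothetical single FedPeer round: the federated cluster is
    # averaged into one representative model, which then gossips with
    # the low-confidence peers according to `pairs` (indices into the
    # combined list: representative first, then p2p nodes in order).
    fed_ids, p2p_ids = partition_by_confidence(confidences, threshold)
    representative = federated_average([models[i] for i in fed_ids])
    group = [representative] + [models[i] for i in p2p_ids]
    return gossip_round(group, pairs)
```

For example, with four one-parameter models `[[0.0], [2.0], [4.0], [6.0]]` and confidences `[0.9, 0.8, 0.3, 0.2]` at threshold 0.7, the first two models average into a representative `[1.0]`, which can then gossip with the remaining peers.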

Citation (APA)

Kasturi, A., Sivaraju, R., & Hota, C. (2022). FedPeer: A Peer-to-Peer Learning Framework Using Federated Learning. In Lecture Notes in Electrical Engineering (Vol. 869, pp. 517–525). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-19-0019-8_39
