A Communication-Efficient Federated Text Classification Method Based on Parameter Pruning


Abstract

Text classification is an important application of machine learning. This paper proposes a communication-efficient federated text classification method based on parameter pruning. Because the data of different participants in a federated learning architecture are not independent and identically distributed (non-IID), a federated word embedding model, FedW2V, is proposed. The TextCNN model is then extended to the federated architecture. To reduce the communication cost of the federated TextCNN model, a parameter pruning algorithm called FedInitPrune is proposed, which reduces the volume of data transmitted in both the uplink and the downlink during the parameter exchange phase. The algorithms are evaluated on real-world datasets. The experimental results show that the number of communicated parameters can be reduced by 74.26% while the classification accuracy drops by less than 2%.
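The abstract does not specify how FedInitPrune selects parameters, so the following is only a rough illustrative sketch of the general idea: prune model parameters once by magnitude at initialization and thereafter transmit only the retained entries, shrinking both uplink and downlink traffic by the keep ratio. The function names (`init_prune_mask`, `compress`, `decompress`), the magnitude criterion, and the keep ratio are all assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def init_prune_mask(params: np.ndarray, keep_ratio: float = 0.25) -> np.ndarray:
    """Build a boolean mask keeping the largest-magnitude entries.

    NOTE: magnitude pruning at initialization is an assumption here;
    the paper's FedInitPrune criterion may differ.
    """
    flat = np.abs(params).ravel()
    k = max(1, int(len(flat) * keep_ratio))
    threshold = np.partition(flat, -k)[-k]  # k-th largest magnitude
    return np.abs(params) >= threshold

def compress(params: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Client/server sends only the retained entries (uplink/downlink)."""
    return params[mask]

def decompress(values: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Receiver reconstructs the full tensor, with pruned entries fixed at zero."""
    out = np.zeros(mask.shape, dtype=values.dtype)
    out[mask] = values
    return out
```

With a keep ratio of roughly 0.25, each round transmits about a quarter of the original parameters, which is in the same ballpark as the 74.26% reduction reported in the abstract; the exact ratio used by the paper is not stated here.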

Citation (APA)

Huo, Z., Fan, Y., & Huang, Y. (2023). A Communication-Efficient Federated Text Classification Method Based on Parameter Pruning. Mathematics, 11(13). https://doi.org/10.3390/math11132804
