Federated learning enables the joint training of machine learning models over distributed data and computing resources while protecting the privacy of local data. Existing asynchronous federated learning effectively addresses problems such as wasted computing resources and the low training efficiency caused by synchronous learning. However, because it aggregates local models from different nodes and updates the global model through a central server, it is inherently bound to a centralized trust model and suffers from issues such as single point of failure and privacy leakage. In this paper, we propose a blockchain-based privacy-preserving asynchronous federated learning scheme, which ensures trustworthiness by storing local models on the blockchain and generating the global model through a consensus algorithm. To guarantee the privacy of federated learning and improve model utility, the exponential mechanism of differential privacy is used to select, with high probability, model gradients that contribute the most, and a lower privacy budget is allocated to protect model privacy. In addition, to address the problem of clock desynchronization in asynchronous federated learning, we propose a two-factor adjustment mechanism that further improves the utility of the global model. Finally, theoretical analysis and experimental results demonstrate that the proposed scheme effectively guarantees the trustworthiness and privacy of asynchronous federated learning while improving model utility.
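The abstract mentions using the exponential mechanism of differential privacy to select high-contribution gradients with high probability. The paper's exact scoring function and budget allocation are not given here, so the following is only a minimal sketch of the standard exponential mechanism: given utility scores for candidate gradients, a candidate is sampled with probability proportional to exp(ε · score / (2Δ)), where Δ is the sensitivity of the score. The function names and the toy contribution scores are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def exponential_mechanism(scores, epsilon, sensitivity=1.0, rng=random):
    """Sample one index with probability proportional to
    exp(epsilon * score / (2 * sensitivity)) -- the standard
    exponential mechanism of differential privacy."""
    # Subtract the max score before exponentiating for numerical stability;
    # this shifts all weights by a constant factor and leaves the
    # sampling distribution unchanged.
    m = max(scores)
    weights = [math.exp(epsilon * (s - m) / (2.0 * sensitivity))
               for s in scores]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(scores) - 1  # guard against floating-point rounding

# Hypothetical use: pick one of four candidate gradient components,
# favoring the highest-contribution one at high probability.
contributions = [0.1, 0.9, 0.3, 0.2]
chosen = exponential_mechanism(contributions, epsilon=2.0)
```

Larger ε concentrates probability on the top-scoring candidates (better utility, weaker privacy); smaller ε flattens the distribution toward uniform (stronger privacy).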
CITATION STYLE
Gao, S., Yuan, L., Zhu, J., Ma, X., Zhang, R., & Ma, J. (2021). A blockchain-based privacy-preserving asynchronous federated learning. Scientia Sinica Informationis, 51(10), 1755–1774. https://doi.org/10.1360/SSI-2021-0087