GELU-net: A globally encrypted, locally unencrypted deep neural network for privacy-preserved learning

71 citations · 54 Mendeley readers

Abstract

Privacy is a fundamental challenge for a variety of smart applications that depend on data aggregation and collaborative learning across different entities. In this paper, we propose a novel privacy-preserved architecture in which clients can collaboratively train a deep model while preserving the privacy of each client's data. Our main strategy is to carefully partition a deep neural network between two non-colluding parties. One party performs linear computations on encrypted data using a less complex homomorphic cryptosystem, while the other executes non-polynomial computations in plaintext but in a privacy-preserved manner. We analyze security and compare the communication and computation complexity with existing approaches. Our extensive experiments on different datasets demonstrate not only stable training without accuracy loss, but also a 14 to 35 times speedup compared to the state-of-the-art system.
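To make the abstract's split concrete, below is a minimal sketch of that kind of two-party partition, assuming a Paillier-style additively homomorphic scheme via the python-paillier (`phe`) package. The layer sizes, weights, and sigmoid activation are illustrative placeholders chosen for the example, not details taken from the paper.

```python
# Sketch of the split described in the abstract: one party evaluates a linear
# layer on Paillier-encrypted inputs (ciphertext-plaintext operations only),
# the other party decrypts and applies the non-polynomial activation in
# plaintext. Requires the python-paillier package (`pip install phe`).
import numpy as np
from phe import paillier

pub, priv = paillier.generate_paillier_keypair(n_length=1024)

# --- Data-owning party: encrypts its feature vector ------------------------
x = np.array([0.5, -1.2, 3.0])                       # illustrative input
enc_x = [pub.encrypt(float(v)) for v in x]

# --- Model-owning party: linear layer on ciphertexts -----------------------
W = np.array([[0.1, -0.3, 0.2],
              [0.4,  0.0, -0.1]])                    # illustrative weights
b = np.array([0.05, -0.2])
# Paillier supports Enc(x) * w (plaintext scalar) and Enc(x) + Enc(y),
# so W @ Enc(x) + b can be computed entirely in the encrypted domain.
enc_z = [sum(enc_x[j] * float(W[i, j]) for j in range(W.shape[1])) + float(b[i])
         for i in range(W.shape[0])]

# --- Data-owning party: decrypts pre-activations, applies non-linearity ----
z = np.array([priv.decrypt(c) for c in enc_z])
a = 1.0 / (1.0 + np.exp(-z))                          # e.g., a sigmoid activation
print(a)                                              # plaintext activations for the next layer
```

In this sketch the non-polynomial step (the activation) never has to be approximated by a polynomial inside the cryptosystem, which is the source of the efficiency the abstract reports; the exact protocol, layer handling, and security guarantees are given in the paper itself.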

Cite (APA)

Zhang, Q., Wang, C., Wu, H., Xin, C., & Phuong, T. V. (2018). GELU-net: A globally encrypted, locally unencrypted deep neural network for privacy-preserved learning. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 3933–3939). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/547
