Effective Activation Functions for Homomorphic Evaluation of Deep Neural Networks

Abstract

CryptoNets and subsequent work have demonstrated the capability of homomorphic encryption (HE) in applications of private artificial intelligence (AI). In convolutional neural networks (CNNs), many computations, such as those in the convolution layers, are linear functions that can be evaluated homomorphically. However, other layers, such as the activation layers, consist of non-linear functions that cannot. One of the most common workarounds is to approximate these non-linear functions with low-degree polynomials. Using the approximating polynomials as activation functions, however, introduces errors that can significantly reduce accuracy on classification tasks. In this paper, we present a systematic method for constructing HE-friendly activation functions for CNNs. We first determine which properties of a good activation function contribute to performance by analyzing commonly used functions such as the Rectified Linear Unit (ReLU) and Sigmoid. We then compare polynomial approximation methods and search for an optimal approximation range for the polynomial activation. We also propose a novel weighted polynomial approximation method tailored to the output distribution of a batch normalization layer. Finally, we demonstrate the effectiveness of our method on several datasets, including MNIST, FMNIST, and CIFAR-10.
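
For illustration only, the kind of weighted low-degree polynomial fit the abstract describes can be sketched in a few lines of NumPy. The approximation range r = 4, the polynomial degree 2, and the standard-normal weight below are illustrative assumptions, not the paper's reported settings.

    import numpy as np

    # Approximation range [-r, r]; the paper searches for an optimal
    # range, and r = 4.0 here is only an illustrative choice.
    r = 4.0
    x = np.linspace(-r, r, 2001)
    relu = np.maximum(x, 0.0)

    # Weight points by a standard normal density, mimicking the idea
    # that a batch normalization layer concentrates its outputs near
    # zero. NumPy applies w to the *unsquared* residual, so we pass
    # the square root of the density: sqrt(exp(-x^2/2)) = exp(-x^2/4).
    w = np.exp(-x**2 / 4.0)

    # Weighted least-squares fit of a degree-2 polynomial; a low degree
    # keeps the multiplicative depth of the encrypted evaluation small.
    coeffs = np.polynomial.polynomial.polyfit(x, relu, deg=2, w=w)

    def poly_act(z):
        # HE-friendly activation: only additions and multiplications,
        # which homomorphic encryption schemes support natively.
        return np.polynomial.polynomial.polyval(z, coeffs)

    print(coeffs)                               # [c0, c1, c2]
    print(poly_act(np.array([-1.0, 0.0, 1.0])))

In a deployed system, the coefficients would be computed in the clear as above, and the polynomial would then be evaluated on ciphertexts with an HE library; the sketch only shows the fitting step.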

Citation (APA)

Obla, S., Gong, X., Aloufi, A., Hu, P., & Takabi, D. (2020). Effective Activation Functions for Homomorphic Evaluation of Deep Neural Networks. IEEE Access, 8, 153098–153112. https://doi.org/10.1109/ACCESS.2020.3017436
