NeuroSCA: Evolving Activation Functions for Side-Channel Analysis

Abstract

The choice of activation function can significantly impact the performance of neural networks. With an ever-increasing number of new activation functions proposed in the literature, selecting an appropriate one becomes even more difficult. Consequently, many researchers approach the problem from a different angle: instead of selecting an existing activation function, they evolve one suited to the problem at hand. In this paper, we demonstrate that evolutionary algorithms can evolve new activation functions for side-channel analysis (SCA) that outperform ReLU and other activation functions commonly applied to that problem. More specifically, we use Genetic Programming to define and explore candidate activation functions (neuroevolution) in the form of mathematical expressions that are gradually improved. Experiments on the ASCAD database show that this approach is highly effective compared to results obtained with standard activation functions and that it can match state-of-the-art results from the literature. In particular, the results for the ASCAD fixed-key dataset show that the evolved activation functions improve on the current state of the art, achieving a guessing entropy of 287 for the Hamming weight leakage model and 115 for the Identity leakage model, compared to 447 and 120 reported in the literature.
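The abstract's core idea, evolving activation functions as expression trees with Genetic Programming, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the primitive sets, tree depths, variation operators, and above all the fitness function are assumptions. In the paper, fitness would come from an SCA metric (e.g., guessing entropy after training a network on ASCAD traces with the candidate activation); here a cheap stand-in score is used so the loop is runnable.

```python
import math
import random

# Assumed primitive sets; the paper's exact function/terminal sets are not
# given in the abstract, so these are illustrative choices.
BINARY_OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "max": max,
}
UNARY_OPS = {
    "tanh": math.tanh,
    "neg": lambda a: -a,
}

def random_tree(depth, rng):
    """Grow a random expression tree; leaves are the input x or a constant."""
    if depth == 0 or rng.random() < 0.3:
        return ("x",) if rng.random() < 0.7 else ("const", rng.uniform(-1.0, 1.0))
    if rng.random() < 0.5:
        return ("un", rng.choice(sorted(UNARY_OPS)), random_tree(depth - 1, rng))
    return ("bin", rng.choice(sorted(BINARY_OPS)),
            random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, x):
    """Evaluate the candidate activation function at a scalar input x."""
    tag = tree[0]
    if tag == "x":
        return x
    if tag == "const":
        return tree[1]
    if tag == "un":
        return UNARY_OPS[tree[1]](evaluate(tree[2], x))
    return BINARY_OPS[tree[1]](evaluate(tree[2], x), evaluate(tree[3], x))

def mutate(tree, rng, depth=2):
    """Subtree mutation: replace a randomly chosen subtree with a fresh one."""
    if tree[0] in ("x", "const") or rng.random() < 0.2:
        return random_tree(depth, rng)
    if tree[0] == "un":
        return ("un", tree[1], mutate(tree[2], rng, depth))
    children = [tree[2], tree[3]]
    which = rng.randrange(2)
    children[which] = mutate(children[which], rng, depth)
    return ("bin", tree[1], children[0], children[1])

def fitness(tree):
    """Placeholder fitness: similarity to ReLU at a few sample points.
    In the paper, fitness would instead be an SCA-driven metric such as
    guessing entropy of a network trained with this activation."""
    xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
    return -sum((evaluate(tree, x) - max(0.0, x)) ** 2 for x in xs)

def evolve(pop_size=20, generations=30, seed=0):
    """Simple (mu + lambda)-style loop: keep the best half, refill by mutation."""
    rng = random.Random(seed)
    pop = [random_tree(3, rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors), rng) for _ in survivors]
    return max(pop, key=fitness)
```

A call such as `evolve(pop_size=20, generations=30)` returns the best expression tree found, which could then be wrapped as an activation layer in a deep-learning framework and evaluated on the actual SCA task.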

Citation (APA)

Knezevic, K., Fulir, J., Jakobovic, D., Picek, S., & Durasevic, M. (2023). NeuroSCA: Evolving Activation Functions for Side-Channel Analysis. IEEE Access, 11, 284–299. https://doi.org/10.1109/ACCESS.2022.3232064
