A Hybrid CNN-RNN Model Changes the Game for Side-Channel Attack Prediction

Abstract

Side-Channel Attacks (SCAs) are a common class of cyberattack affecting many devices and equipment connected to a network. These attacks take several forms, including power attacks such as Differential Power Analysis (DPA), electromagnetic attacks, storage attacks, and others. Researchers and cybersecurity experts are concerned about SCAs targeting devices because they can lead to the loss and theft of sensitive information, and many researchers have sought to reduce their impact. Deep Learning (DL) based solutions have shown significant promise in detecting SCAs owing to their ability to identify complex patterns and anomalies in large datasets. Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), including Long Short-Term Memory (LSTM) networks, are effective for analyzing time-series data and identifying temporal dependencies in side-channel traces. Although DL models have been successfully applied to detect power analysis, electromagnetic, and timing attacks, they still face challenges with data collection, model generalization, adversarial attacks, and robustness. This study aims to enhance side-channel attack prediction using deep-learning algorithms by investigating DL models for predicting SCAs, with a focus on DPA. The DPA contest v4.2 dataset, which contains 3,253 features and 80,000 traces, was used to predict power attacks. Several data preprocessing techniques, including Hamming and Hanning windows, and feature selection using SelectKBest, were applied to enhance model performance. The efficacy of CNN, RNN, and a hybrid CNN-RNN model was explored. The results indicate that the hybrid CNN-RNN model achieved an accuracy of 99.6%, a loss of 0.0270, and an execution time of 15,240.42 seconds. This study underscores the importance of optimal hyperparameter tuning, including the choice of activation function (SeLU) and optimizer (Adam), in improving DL model performance.
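The preprocessing steps named in the abstract (windowing the traces, then selecting the most informative sample points with SelectKBest) can be sketched as below. This is a minimal illustration, not the study's actual pipeline: the synthetic traces, binary labels, trace dimensions, and the value of k are hypothetical stand-ins for the DPA contest v4.2 data (80,000 traces of 3,253 points each).

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for power traces: 200 traces of 256 sample points
# (the real DPA contest v4.2 data is 80,000 traces x 3,253 points).
rng = np.random.default_rng(0)
n_traces, n_points = 200, 256
traces = rng.normal(size=(n_traces, n_points))
labels = rng.integers(0, 2, size=n_traces)  # hypothetical binary labels

# Step 1: apply a Hanning window to each trace to taper edge effects
# (the study also tried a Hamming window: np.hamming(n_points)).
windowed = traces * np.hanning(n_points)

# Step 2: keep only the k sample points most correlated with the label,
# reducing the feature dimension before training the CNN/RNN models.
selector = SelectKBest(score_func=f_classif, k=64)
selected = selector.fit_transform(windowed, labels)

print(selected.shape)  # reduced feature matrix: (n_traces, k)
```

The windowed, dimension-reduced traces would then be fed to the CNN, RNN, or hybrid CNN-RNN model; the choice of k trades off information retained against training cost.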
