Intelligent Intrusion Detection System Using NSOA and Hybrid ECA-LiteCBNet Model for Cyber Threat Mitigation

Abstract

Cyber threat mitigation refers to the strategic implementation of technologies, policies, and practices to detect, prevent, and respond to malicious cyber activities. It involves reducing vulnerabilities, managing risks, and ensuring resilience against threats such as malware, phishing, ransomware, and denial-of-service attacks. Effective mitigation strengthens the security posture of digital systems and safeguards data integrity and confidentiality. This paper presents a novel intrusion detection system (IDS) that integrates a biologically inspired Neural Synapse Optimization Algorithm (NSOA) for optimal feature selection with a hybrid deep learning classification model based on Effective Channel Attention with a Lightweight Convolutional Neural Network and Bidirectional Long Short-Term Memory (ECA-LiteCBNet). The proposed approach is evaluated on three widely recognized cybersecurity datasets: UNSW-NB15, NSL-KDD, and CSE-CIC-IDS2018. To ensure model reliability and fairness, robust preprocessing is performed, including missing-value handling, duplicate removal, categorical encoding, normalization, and class balancing through SMOTE/ADASYN. NSOA simulates the synaptic learning mechanisms of biological neurons, dynamically optimizing feature subsets to enhance learning and reduce redundancy. The ECA-LiteCBNet model captures both spatial and temporal patterns, which is crucial for detecting complex attack sequences. Comparative analyses with six popular feature selection algorithms (GA, GWO, TSR, ACO, CRO, BWO) demonstrate the superiority of NSOA in identifying high-value features. Similarly, the proposed hybrid model outperforms classical classifiers such as SVM, KNN, DNN, Autoencoder, and XGBoost, as well as deep networks including LSTM, RNN, and 1D-CNN. The system achieves top-tier performance across all datasets, with average detection accuracy exceeding 98.5% and AUC scores above 0.995. Visualization through ROC curves and training-validation accuracy/loss curves confirms model stability and convergence. This study highlights the potential of neuro-inspired optimization for cybersecurity applications and sets the stage for real-time, scalable threat-detection frameworks.
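The preprocessing stage described above (missing-value handling, duplicate removal, categorical encoding, normalization) can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the field names (`proto`, `bytes`) are hypothetical, and a real implementation would typically use pandas, scikit-learn, and imbalanced-learn (for SMOTE/ADASYN) rather than pure Python.

```python
def preprocess(records):
    """records: list of dicts with a categorical 'proto' and numeric 'bytes' field."""
    # 1. Missing-value handling: drop records containing None.
    rows = [r for r in records if all(v is not None for v in r.values())]
    # 2. Duplicate removal: keep the first occurrence of each record.
    seen, unique = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    # 3. Categorical encoding: map each protocol string to an integer label.
    labels = {v: i for i, v in enumerate(sorted({r["proto"] for r in unique}))}
    # 4. Normalization: min-max scale the numeric field to [0, 1].
    vals = [r["bytes"] for r in unique]
    lo, hi = min(vals), max(vals)
    span = (hi - lo) or 1
    return [
        {"proto": labels[r["proto"]], "bytes": (r["bytes"] - lo) / span}
        for r in unique
    ]

data = [
    {"proto": "tcp", "bytes": 100},
    {"proto": "udp", "bytes": 300},
    {"proto": "tcp", "bytes": 100},    # exact duplicate, removed
    {"proto": "icmp", "bytes": None},  # missing value, dropped
]
clean = preprocess(data)
```

Class balancing (SMOTE/ADASYN) would then be applied to the encoded feature matrix before training, e.g. via `imblearn.over_sampling.SMOTE`.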
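The AUC figure cited above can be computed directly from classifier scores via the Mann-Whitney U statistic: the probability that a randomly chosen attack sample is scored higher than a randomly chosen benign one. The sketch below uses made-up scores for illustration and is not the authors' evaluation code.

```python
def auc(scores_pos, scores_neg):
    """AUC as the fraction of (attack, benign) pairs ranked correctly;
    ties count as half a win (Mann-Whitney U / (n_pos * n_neg))."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos
        for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

attack_scores = [0.91, 0.88, 0.97, 0.73]  # hypothetical scores for attack traffic
benign_scores = [0.12, 0.35, 0.08, 0.40]  # hypothetical scores for benign traffic
score = auc(attack_scores, benign_scores)  # 1.0 here: perfect separation
```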
