Evaluating Lightweight Neural Models for Edge-Based Anomaly Detection: Performance and Efficiency Trade-offs
Abstract
In edge computing scenarios, constraints on memory, latency, and energy often rule out large-scale models, so lightweight neural networks are increasingly favored for anomaly detection. However, no unified benchmark exists for comparing commonly used lightweight models under these constraints. This research addresses that gap by evaluating three prominent lightweight neural architectures, namely pruned convolutional neural networks (CNNs), quantized long short-term memory networks (LSTMs), and distilled transformers, on two established intrusion detection datasets: CIC-IoT-DIAD 2024 and TON_IoT (the TON_IoT_Modbus and TON_IoT_Thermostat subsets). We evaluate each model using standard detection measures (accuracy, precision, recall, and F1-score) and deployment metrics (model size, inference latency, and memory consumption) under simulated edge constraints. Our findings reveal significant trade-offs between detection accuracy and efficiency, with performance varying by dataset: some models handle flow-based network data more effectively, while others are better suited to IoT telemetry, and no single model excels across all evaluation criteria. This work provides a reliable, reproducible foundation for future studies of edge-optimized anomaly detection and practical guidance for selecting models for real-time edge deployment. The results also inform the design of our forthcoming architecture, S3LiteNet, which aims to improve both performance and deployability in information-centric networks.
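To make the evaluation protocol concrete, the sketch below illustrates one way the detection and deployment metrics named in the abstract could be computed for a lightweight model. This is a minimal illustration, not the authors' evaluation harness: it assumes a PyTorch model and scikit-learn metrics, and the helper names `deployment_metrics` and `detection_metrics` are hypothetical.

```python
# Illustrative sketch only (hypothetical helpers, not the paper's code):
# computing the deployment metrics (model size, inference latency) and the
# standard detection measures (accuracy, precision, recall, F1) from the abstract.
import os
import time
import tempfile

import torch
import torch.nn as nn
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def deployment_metrics(model: nn.Module, sample: torch.Tensor, runs: int = 100):
    """Return model size in MB and mean single-sample inference latency in ms."""
    # Model size: serialize the weights to disk and measure the file size.
    tmp_path = os.path.join(tempfile.gettempdir(), "edge_model_tmp.pt")
    torch.save(model.state_dict(), tmp_path)
    size_mb = os.path.getsize(tmp_path) / 1e6
    os.remove(tmp_path)

    # Latency: average wall-clock time over repeated single-sample forward passes.
    model.eval()
    with torch.no_grad():
        model(sample)  # warm-up pass before timing
        start = time.perf_counter()
        for _ in range(runs):
            model(sample)
        latency_ms = (time.perf_counter() - start) / runs * 1e3
    return size_mb, latency_ms


def detection_metrics(y_true, y_pred):
    """Binary anomaly-detection scores: accuracy, precision, recall, F1."""
    acc = accuracy_score(y_true, y_pred)
    prec, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="binary"
    )
    return {"accuracy": acc, "precision": prec, "recall": rec, "f1": f1}
```

In practice, memory consumption would also be recorded (for example, peak resident memory on the target edge device), and the same measurement routine would be applied uniformly to the pruned CNN, quantized LSTM, and distilled transformer so that the efficiency figures remain directly comparable.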