Machine Learning–Based Cryptographic Selection System for IoT Devices Based on Computational Resources

Abstract

The rapid expansion of the Internet of Things (IoT) increases the need for secure, efficient data protection on resource-constrained devices. Choosing a cryptographic algorithm for such devices is challenging: conventional algorithms can degrade performance, while lightweight algorithms may lack adequate security. This research introduces a supervised machine learning system that autonomously recommends optimal encryption methods based on device capabilities. A large-scale data-generation framework simulated 16 symmetric and hybrid cryptographic algorithms on varied IoT devices, and algorithm suitability was scored using weighted metrics: execution time, RAM and ROM usage, and battery consumption. This dataset was used to train machine learning models, of which Random Forest performed best. Further optimization applied Sequential Backward Selection (SBS), the Synthetic Minority Oversampling Technique (SMOTE), and hyperparameter tuning. The resulting model achieved near-perfect accuracy, F1 score, Matthews correlation coefficient (MCC), and area under the receiver operating characteristic curve (AUC-ROC). The optimized model delivers reliable, context-aware cryptographic recommendations without manual evaluation, helping IoT designers deploy resource-efficient security. The study is limited by its simulation-based measurements and by the exclusion of post-quantum algorithms, but it provides a scalable basis for intelligent selection of cryptographic algorithms in future IoT environments.
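For illustration, the sketch below shows one way the pipeline summarized above could be assembled in Python with pandas, imbalanced-learn, and scikit-learn: a weighted suitability score over normalized cost metrics, SMOTE class balancing, and a grid-searched Random Forest. The column names, metric weights, dataset file, and hyperparameter grid are assumptions for the example rather than values from the paper, and the Sequential Backward Selection step is omitted for brevity.

```python
# Illustrative sketch only; column names, weights, and grid are assumed,
# not taken from the paper.
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

df = pd.read_csv("iot_crypto_benchmarks.csv")  # hypothetical simulated dataset

# Weighted suitability over min-max-normalized cost metrics (lower cost is better).
weights = {"exec_time_ms": 0.4, "ram_kb": 0.2, "rom_kb": 0.2, "battery_mj": 0.2}
norm = (df[list(weights)] - df[list(weights)].min()) / (
    df[list(weights)].max() - df[list(weights)].min()
)
df["suitability"] = 1.0 - sum(w * norm[c] for c, w in weights.items())

# Label each device profile with its best-scoring algorithm (assumed columns).
best = df.loc[df.groupby("device_id")["suitability"].idxmax()]
X = best[["cpu_mhz", "total_ram_kb", "total_rom_kb", "battery_mah"]]
y = best["algorithm"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# SMOTE balances under-represented algorithm classes in the training split.
X_res, y_res = SMOTE(random_state=42).fit_resample(X_tr, y_tr)

# Grid search tunes the Random Forest; the winning estimator is the recommender.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    {"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=5,
)
search.fit(X_res, y_res)
print("held-out accuracy:", search.score(X_te, y_te))
```

In this framing, recommending an algorithm for a new device reduces to calling `search.predict()` on its resource profile, which matches the abstract's goal of context-aware selection without manual benchmarking.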
