An Efficient Prediction Based Dynamic Resource Allocation Framework in Quantum Cloud Using Knowledge Based Offline Reinforcement Learning

Abstract

Quantum Cloud Computing (QCC) is the practice of delivering computing resources over the internet on a pay-as-you-go basis with the support of the Quantum Computing (QC) paradigm. Real-time applications impose strict performance requirements that are difficult to satisfy with the limited speed of traditional computers, and estimating resource usage in a cloud data center is challenging because of its dynamic nature. This work employs a modern model to accurately estimate data center CPU utilization and applies an effective resource controller for optimized resource allocation using quantum computers. The proposed design provides efficient resource estimation and scales resources up or down based on predictions. An efficient dynamic resource controller is crucial to managing the scaling process with quantum computing support. Existing systems use a reinforcement-learning-based resource controller built on a Markov decision process that decides from the current state of the environment alone, leading to long scaling and processing times. Our proposed model, the Prediction-Based Offline Reinforcement Learning (PB-ORL) model, improves on this by incorporating historical information into prediction-based decisions. This approach yields accurate, high-performance prediction and optimizes resource allocation proactively and dynamically. The model is evaluated on a real cloud data set using quantum cloud and machine learning approaches, reducing latency and bandwidth traffic. Empirical results show that the proposed quantum computer-based machine learning approach outperforms previous methods, achieving 30-50% higher accuracy in CPU resource utilization prediction and reducing the time complexity of resource allocation by 33-42%.
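To make the proactive idea concrete, the sketch below shows a minimal prediction-based scaling controller: it keeps a window of historical CPU utilization samples, forecasts the next value, and scales up or down from the forecast rather than from the current reading alone. The window size, thresholds, class name, and moving-average-plus-trend forecaster are illustrative assumptions for this sketch; they are not the paper's actual PB-ORL model.

```python
from collections import deque


class PredictionBasedController:
    """Illustrative proactive scaler driven by a forecast of CPU utilization."""

    def __init__(self, window=12, upper=0.80, lower=0.30):
        self.history = deque(maxlen=window)  # recent CPU utilization samples in [0, 1]
        self.upper = upper                   # scale-up threshold applied to the forecast
        self.lower = lower                   # scale-down threshold applied to the forecast

    def observe(self, cpu_util):
        """Record one monitoring sample of CPU utilization."""
        self.history.append(cpu_util)

    def forecast(self):
        """Naive forecast: moving average plus a recent linear trend term."""
        if len(self.history) < 2:
            return self.history[-1] if self.history else 0.0
        avg = sum(self.history) / len(self.history)
        trend = self.history[-1] - self.history[0]
        return max(0.0, min(1.0, avg + trend / len(self.history)))

    def decide(self):
        """Return a proactive scaling action based on the forecast, not the current state."""
        predicted = self.forecast()
        if predicted > self.upper:
            return "scale_up"
        if predicted < self.lower:
            return "scale_down"
        return "hold"


if __name__ == "__main__":
    controller = PredictionBasedController()
    for sample in [0.60, 0.65, 0.72, 0.80, 0.88, 0.95]:  # steadily rising utilization
        controller.observe(sample)
    # Forecast is roughly 0.83, above the upper threshold, so the controller scales up
    # before utilization saturates, unlike a purely reactive threshold check.
    print(controller.forecast(), controller.decide())
```

A reactive controller would only act once the current sample crossed a threshold; the prediction-based version acts one step earlier, which is the behavior the PB-ORL framework aims to achieve using learned models over historical traces.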
