Temporal-Spatial Deep Learning for Memory Usage Forecasting in Cloud Servers
Abstract
To address the challenge of highly volatile and hard-to-predict memory usage in cloud servers, this paper proposes a memory usage prediction model that integrates Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks. The approach extracts local spatial correlations from input feature sequences with the CNN module and captures temporal dependencies with the LSTM structure, enabling high-precision prediction of memory usage trends over time. To validate the model's effectiveness, a prediction dataset was constructed from real-world cloud server monitoring data covering ten key resource indicators, and comparative experiments were conducted against several mainstream deep learning models. The results show that the proposed CNN+LSTM model outperforms traditional models on the MSE, MAE, and R² metrics, demonstrating stronger fitting capability and greater stability. Loss convergence analysis and prediction curve comparisons further confirm that the model effectively captures the actual fluctuation patterns of resource usage. It performs particularly well on complex nonlinear sequences, exhibiting both strong predictive performance and practical engineering value.
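To make the CNN-then-LSTM pipeline concrete, the sketch below shows one plausible realization in PyTorch. The abstract specifies only the overall structure (a CNN module for local patterns over the input features, an LSTM for temporal dependencies) and the ten resource indicators; the layer counts, channel sizes (64/128), kernel size, and window length (30 steps) used here are illustrative assumptions, not the paper's reported configuration.

```python
import torch
import torch.nn as nn


class CNNLSTM(nn.Module):
    """Minimal sketch of a CNN+LSTM forecaster: a 1-D convolution extracts
    local patterns across the input window, an LSTM models temporal
    dependencies, and a linear head predicts the next memory-usage value.
    All sizes below are assumptions; the paper's abstract does not give them."""

    def __init__(self, n_features=10, conv_channels=64, lstm_hidden=128):
        super().__init__()
        # Conv1d expects (batch, channels, time); each of the ten resource
        # indicators is treated as an input channel.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)
        self.head = nn.Linear(lstm_hidden, 1)  # next-step memory usage

    def forward(self, x):
        # x: (batch, window_length, n_features)
        z = self.conv(x.transpose(1, 2))       # -> (batch, channels, time)
        out, _ = self.lstm(z.transpose(1, 2))  # -> (batch, time, hidden)
        return self.head(out[:, -1])           # last time step -> prediction


# Example: a batch of 32 windows, each 30 time steps of 10 indicators.
model = CNNLSTM()
x = torch.randn(32, 30, 10)
print(model(x).shape)  # torch.Size([32, 1])
```

A model of this shape would be trained with an MSE loss, which matches the evaluation metrics (MSE, MAE, R²) reported in the abstract.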