A Novel Transformer-Based Framework for Nonlinear Time-Series Prediction of Library Borrowing Volumes
Abstract
Library book circulation is a key metric reflecting the utilization of library collections and informing management decisions. However, forecasting daily borrowing volumes is challenging due to the complex nonlinear temporal patterns in the data. In this work, we propose a Transformer-based model for predicting library borrowing volumes. By leveraging the multi-head self-attention mechanism and a stacked encoder architecture, our approach captures long-range dependencies in the borrowing time series more effectively than traditional methods. We train the model on several years of daily borrowing records and evaluate it against baseline models including Gated Recurrent Units (GRU), Long Short-Term Memory networks (LSTM), and Support Vector Regression (SVR). Experimental results show that, under optimal hyperparameters, the Transformer reduces Root Mean Square Error (RMSE) by 16.2% and Mean Absolute Error (MAE) by 23.2% relative to the best-performing baseline (LSTM). The Transformer also adapts better to dynamic changes in borrowing patterns, yielding improved prediction accuracy. These findings demonstrate the potential of Transformer-based techniques for capturing complex temporal dynamics in library circulation forecasting.
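To make the described architecture concrete, the sketch below shows one plausible way to implement a stacked Transformer encoder with multi-head self-attention for one-step-ahead forecasting of a univariate daily borrowing series, together with RMSE and MAE computation. This is a minimal illustration under assumed choices (PyTorch, a 30-day input window, and hyperparameters such as d_model=64, 4 heads, and 3 encoder layers); the paper's actual implementation and optimal settings are not specified in the abstract.

```python
# Minimal sketch (assumption): a stacked Transformer encoder for one-step-ahead
# forecasting of daily borrowing volumes. Hyperparameters and window length are
# illustrative, not the paper's reported optimal settings.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Standard sinusoidal positional encoding added to the projected inputs."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]


class BorrowingTransformer(nn.Module):
    """Stacked encoder with multi-head self-attention; predicts the next day's volume."""
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 3):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)   # univariate series -> model dimension
        self.pos_enc = PositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=128,
                                           dropout=0.1, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)         # regression head

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, 1) window of past daily borrowing counts
        h = self.encoder(self.pos_enc(self.input_proj(x)))
        return self.head(h[:, -1, :]).squeeze(-1)  # forecast from the last time step


if __name__ == "__main__":
    model = BorrowingTransformer()
    window = torch.randn(8, 30, 1)                # batch of 30-day input windows (dummy data)
    y_true = torch.randn(8)
    y_pred = model(window)
    rmse = torch.sqrt(torch.mean((y_pred - y_true) ** 2))
    mae = torch.mean(torch.abs(y_pred - y_true))
    print(f"RMSE={rmse.item():.3f}  MAE={mae.item():.3f}")
```

The same RMSE and MAE computations shown at the end would be applied to held-out test predictions when comparing against the GRU, LSTM, and SVR baselines.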