Time Series Foundation Model for Improved Transformer Load Forecasting and Overload Detection
Abstract
Conventional load forecasting and overload prediction models, such as LSTM and XGBoost, struggle to handle the growing volume of data in power systems. Recently, various foundation models (FMs) for time series analysis have been proposed that scale to large numbers of time series variables and to datasets across domains. However, their simple pre-training setting leaves FMs ill-suited to complex downstream tasks. Handling real-world tasks effectively depends on additional data, i.e., covariates, and on prior knowledge. Incorporating these through structural modifications to FMs is not feasible, as doing so would disrupt the pre-trained weights. To address this issue, this paper proposes a frequency-domain mixer framework, termed FreqMixer, for enhancing the task-specific analytical capabilities of FMs. FreqMixer is an auxiliary network for the backbone FM that takes covariates as input. It has the same number of layers as the backbone and communicates with it at each layer, allowing prior knowledge to be incorporated without altering the backbone's structure. In experiments, FreqMixer demonstrates high efficiency and performance: it reduces MAPE by 23.65% while improving recall by 87% and precision by 72% in transformer load forecasting during the Spring Festival, and improves precision by 192.09% and accuracy by 14% in the corresponding overload prediction, all while processing data from over 160 transformers with just 1M additional parameters.
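The layer-wise coupling described above can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's implementation): the exact frequency-domain operations, coupling rule, and layer internals are assumptions; it only shows the structural idea of a frozen backbone whose hidden state is augmented at every layer by an auxiliary branch that mixes covariates in the frequency domain.

```python
import numpy as np

rng = np.random.default_rng(0)

class FreqMixerLayer:
    """One auxiliary layer: mixes covariates in the frequency domain.

    Hypothetical stand-in: a real FFT, a learnable per-frequency scaling,
    and an inverse FFT back to the time domain.
    """
    def __init__(self, seq_len, dim):
        n_freq = seq_len // 2 + 1
        self.w = rng.standard_normal((n_freq, dim)) * 0.01  # per-frequency weights

    def __call__(self, cov):
        spec = np.fft.rfft(cov, axis=0)                 # (n_freq, dim) spectrum
        spec = spec * self.w                            # mix frequency components
        return np.fft.irfft(spec, n=cov.shape[0], axis=0)

class FrozenBackboneLayer:
    """Stand-in for one pre-trained FM layer; its weights are never modified."""
    def __init__(self, dim):
        self.w = rng.standard_normal((dim, dim)) * 0.1

    def __call__(self, h):
        return np.tanh(h @ self.w)

def forward(x, cov, backbone, mixers):
    """Run the backbone, adding each mixer's covariate signal to the
    hidden state at the matching layer (one mixer per backbone layer)."""
    h = x
    for fm_layer, mixer in zip(backbone, mixers):
        h = fm_layer(h) + mixer(cov)
    return h

seq_len, dim, n_layers = 32, 8, 4
backbone = [FrozenBackboneLayer(dim) for _ in range(n_layers)]
mixers = [FreqMixerLayer(seq_len, dim) for _ in range(n_layers)]

x = rng.standard_normal((seq_len, dim))    # load series embedding
cov = rng.standard_normal((seq_len, dim))  # covariates (e.g., calendar features)
out = forward(x, cov, backbone, mixers)
print(out.shape)  # (32, 8)
```

The point of this structure is that the auxiliary branch adds capacity (here, only the small per-frequency weight tensors) without touching the backbone's parameters, which is consistent with the abstract's claim of roughly 1M additional parameters on top of the pre-trained FM.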