TCLformer: Enhancing Multi-Scale Time Series Forecasting with Temporal Decomposition and Sparse Convolutional Attention

Abstract

Real-world multivariate time series (MTS) often contain complex multi-scale temporal dependencies, and accurately capturing both their short-term and long-term patterns remains challenging. Although Transformer-based models have made significant progress in this field in recent years, they still suffer from high computational overhead and a limited ability to model long-term dependencies. Convolution-based models, while computationally efficient, struggle to capture global dependencies and tend to overlook periodic and trend information in time series. To address this, we propose TCLformer, a forecasting model that combines multi-scale adaptive convolutions with a sparse convolutional attention mechanism. The model first performs trend-seasonal decomposition on the input series to disentangle intertwined temporal patterns; it then uses parallel multi-scale convolutional modules to adaptively capture local features at different time scales; finally, it introduces a sparse attention module based on causal convolutions to efficiently model long-term dependencies. Experimental results on six public datasets demonstrate that TCLformer significantly outperforms existing state-of-the-art methods in prediction accuracy, validating its effectiveness and superiority on multivariate time series forecasting tasks.
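To make the pipeline the abstract outlines concrete, the sketch below shows one plausible PyTorch realization of the three components: a moving-average trend-seasonal decomposition, parallel multi-scale depthwise convolutions for local features, and a sparse attention whose queries and keys come from causal convolutions. All class names (SeriesDecomposition, MultiScaleConv, SparseCausalConvAttention, TCLformerBlock), kernel sizes, and the top-k sparsification rule are illustrative assumptions on our part; the paper's exact architecture may differ.

```python
# A minimal sketch of the components the abstract describes; module names,
# kernel sizes, and the top-k sparsification rule are assumptions, not the
# authors' exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):  # x: (batch, length, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        return x - trend, trend  # seasonal, trend


class MultiScaleConv(nn.Module):
    """Parallel depthwise convolutions at several kernel sizes, fused by a 1x1 conv."""

    def __init__(self, channels: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(channels, channels, k, padding=k // 2, groups=channels)
            for k in kernel_sizes
        )
        self.fuse = nn.Conv1d(channels * len(kernel_sizes), channels, 1)

    def forward(self, x):  # x: (batch, length, channels)
        z = x.transpose(1, 2)
        z = torch.cat([branch(z) for branch in self.branches], dim=1)
        return self.fuse(z).transpose(1, 2)


class SparseCausalConvAttention(nn.Module):
    """Self-attention with causally convolved queries/keys and top-k sparsification."""

    def __init__(self, d_model: int, kernel_size: int = 3, top_k: int = 16):
        super().__init__()
        self.top_k = top_k
        self.pad = kernel_size - 1  # left padding only => causal
        self.q_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.k_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x):  # x: (batch, length, d_model)
        z = F.pad(x.transpose(1, 2), (self.pad, 0))
        q = self.q_conv(z).transpose(1, 2)
        k = self.k_conv(z).transpose(1, 2)
        v = self.v_proj(x)
        scores = q @ k.transpose(1, 2) / (x.size(-1) ** 0.5)
        # keep only the top-k scores per query; mask the rest to -inf
        k_eff = min(self.top_k, scores.size(-1))
        thresh = scores.topk(k_eff, dim=-1).values[..., -1:]
        scores = scores.masked_fill(scores < thresh, float("-inf"))
        return torch.softmax(scores, dim=-1) @ v


class TCLformerBlock(nn.Module):
    """One block: decompose, model the seasonal part locally then globally, add trend back."""

    def __init__(self, d_model: int):
        super().__init__()
        self.decomp = SeriesDecomposition()
        self.local = MultiScaleConv(d_model)
        self.global_attn = SparseCausalConvAttention(d_model)

    def forward(self, x):
        seasonal, trend = self.decomp(x)
        seasonal = seasonal + self.local(seasonal)
        seasonal = seasonal + self.global_attn(seasonal)
        return seasonal + trend


# quick shape check: batch of 8 series, 96 time steps, 64 channels
block = TCLformerBlock(64)
print(block(torch.randn(8, 96, 64)).shape)  # torch.Size([8, 96, 64])
```

The causal left-padding in SparseCausalConvAttention keeps each query/key from seeing future steps, while the top-k mask retains only the strongest attention links per query, which is one common way to realize "sparse" attention; how TCLformer selects its sparse pattern is not specified in the abstract.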
