SALSTM: Segmented Self-Attention LSTM for Long-Term Forecasting

Abstract

Time series forecasting plays a crucial role in fields such as financial market analysis, weather prediction, and traffic flow forecasting. Although Long Short-Term Memory (LSTM) networks perform well on traditional forecasting tasks, their performance degrades significantly as the prediction horizon grows in long-term time series forecasting (LTSF); once the prediction length reaches 96 steps, the outputs are of little practical use. To address this issue, we propose SALSTM, an innovative time series forecasting model that combines segmented iteration with an intra-segment self-attention mechanism. By reducing the number of recursive iterations and strengthening the ability to capture long-range dependencies, SALSTM significantly improves prediction accuracy and computational efficiency. Experimental results show that SALSTM outperforms traditional LSTM and current state-of-the-art (SOTA) Transformer models across multiple benchmark datasets. These findings indicate that LSTM variants still hold potential for long-term time series forecasting, and that thoughtful design can further enhance their effectiveness.
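
To illustrate the idea described in the abstract, the following is a minimal PyTorch sketch of one possible realization: the input sequence is split into fixed-length segments, self-attention is applied within each segment, and the LSTM recurses over segment-level summaries rather than individual time steps, which reduces the number of recursive iterations. The full article details the actual SALSTM architecture; all class, parameter, and layer names below (SegmentedSelfAttentionLSTM, seg_len, pred_len, the mean-pooling of segments, and so on) are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class SegmentedSelfAttentionLSTM(nn.Module):
    """Hypothetical sketch: intra-segment self-attention + segment-level LSTM."""

    def __init__(self, input_dim, hidden_dim, seg_len, pred_len, num_heads=4):
        super().__init__()
        self.seg_len = seg_len
        self.pred_len = pred_len
        self.input_dim = input_dim
        self.embed = nn.Linear(input_dim, hidden_dim)
        # self-attention over the time steps inside a single segment
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # the LSTM iterates over segments, not raw time steps,
        # so the number of recursive iterations is seq_len / seg_len
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, pred_len * input_dim)

    def forward(self, x):
        # x: (batch, seq_len, input_dim); seq_len assumed divisible by seg_len
        b, t, _ = x.shape
        n_seg = t // self.seg_len
        h = self.embed(x)                               # (b, t, hidden)
        h = h.reshape(b * n_seg, self.seg_len, -1)      # one row per segment
        h, _ = self.attn(h, h, h)                       # intra-segment attention
        seg_repr = h.mean(dim=1).reshape(b, n_seg, -1)  # pool each segment
        out, _ = self.lstm(seg_repr)                    # recurse over segments
        y = self.head(out[:, -1])                       # forecast from last state
        return y.reshape(b, self.pred_len, self.input_dim)

As a usage example, a model built with seg_len=24 on an input of length 96 would perform only 4 LSTM iterations instead of 96, while the attention layer still mixes information among the 24 steps inside each segment.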
