Enhancing long-term forex market forecasting using transformer-based time series models: A comparative analysis with ensemble tree methods

Abstract

Accurate long-term forecasting in the foreign exchange (Forex) market remains challenging due to the inherent volatility, nonlinearity, and periodic trends of currency data. This study proposes a forecasting framework that applies state-of-the-art Transformer-based time series models—Informer, Autoformer, FEDformer, and PatchTST—to capture complex temporal patterns and improve predictive performance. Using historical data from Yahoo Finance enriched with engineered technical indicators (RSI, ATR, SMA, and slope features), we benchmark these models against classical ensemble learning approaches, including random forest, gradient boosting machines (GBM), and XGBoost. Experimental evaluations demonstrate that the Transformer models, particularly FEDformer and PatchTST, outperform the ensemble baselines over multi-step forecasting horizons, reducing MAPE and RMSE by substantial margins. Furthermore, attention-based mechanisms enhance model interpretability while offering superior generalization under volatility shifts. This study highlights the potential of deep Transformer architectures for financial time series forecasting, offering robust decision-making tools for traders and market analysts.
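The feature-engineering step described above can be sketched in pandas. This is a minimal illustration, not the authors' code: the indicator periods, column names, and the least-squares slope definition are assumptions (the abstract does not specify them), and the demo uses synthetic prices in place of a Yahoo Finance download.

```python
import numpy as np
import pandas as pd

def add_indicators(df, rsi_period=14, atr_period=14,
                   sma_window=20, slope_window=5):
    """Append RSI, ATR, SMA, and slope features to an OHLC DataFrame.

    Expects yfinance-style columns 'High', 'Low', 'Close'. The window
    lengths are common conventions, not values from the paper.
    """
    out = df.copy()

    # RSI: ratio of average gains to average losses over rsi_period bars.
    delta = out["Close"].diff()
    gain = delta.clip(lower=0).rolling(rsi_period).mean()
    loss = (-delta.clip(upper=0)).rolling(rsi_period).mean()
    out["RSI"] = 100 - 100 / (1 + gain / loss)

    # ATR: rolling mean of the true range.
    prev_close = out["Close"].shift()
    tr = pd.concat([
        out["High"] - out["Low"],
        (out["High"] - prev_close).abs(),
        (out["Low"] - prev_close).abs(),
    ], axis=1).max(axis=1)
    out["ATR"] = tr.rolling(atr_period).mean()

    # SMA and a simple slope feature: least-squares trend of recent closes.
    out["SMA"] = out["Close"].rolling(sma_window).mean()
    x = np.arange(slope_window)
    out["Slope"] = out["Close"].rolling(slope_window).apply(
        lambda y: np.polyfit(x, y, 1)[0], raw=True
    )
    return out

# Synthetic demo; a real pipeline would fetch EUR/USD bars via yfinance.
rng = np.random.default_rng(0)
close = 1.10 + rng.normal(0, 0.002, 200).cumsum()
df = pd.DataFrame({"Close": close,
                   "High": close + 0.001,
                   "Low": close - 0.001})
feat = add_indicators(df).dropna()
print(feat[["RSI", "ATR", "SMA", "Slope"]].tail())
```

The resulting feature matrix would serve as the covariate input to both the Transformer models and the ensemble baselines, which is what makes the comparison in the study like-for-like.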