COLLAR: Combating Low-rank Temporal Latent Representation for High-dimensional Multivariate Time Series Prediction using Dynamic Koopman Regularization
Abstract
Recent advancements in high-dimensional multivariate time series forecasting have yielded promising outcomes by effectively capturing temporal dynamics within a low-dimensional latent subspace. While Recurrent Neural Networks (RNNs) excel at learning temporal models in reduced-dimensional spaces, the simplistic structure of the decoder fails to capture the diversity present in the observed data. Conversely, introducing a more intricate encoder may exacerbate the risk of overfitting. To address this susceptibility, we propose COLLAR, an innovative architecture that employs dynamic linearized regularization. COLLAR aims to reconcile the inherent nonlinear temporal structures present in low-dimensional subspaces with the linear evolution of multivariate time series. Specifically, we utilize the Auto-Encoder (AE) as the backbone, facilitating the factorization of the multivariate time series into a low-dimensional subspace. We construct a nonlinear prediction model for latent sequences using Gated Recurrent Units (GRUs). To learn linear dynamics, we integrate the data-driven Koopman operator with a modified AE architecture to generate linear-dynamics regularization terms, thereby constraining the uncertainty in nonlinear prediction. The eigenfunctions derived from the Koopman operator provide intrinsic coordinates that globally linearize the dynamics, enabling the prediction of the next state to depend solely on the current state. The proposed approach is evaluated across five publicly available datasets and outperforms several competitive baselines. The code and models are available at https://github.com/Nsyyyy/COLLAR.
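The loss composition described above can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that): the linear encoder, the stand-in for the GRU's one-step latent prediction, the regularization weight, and all names and shapes here are assumptions chosen only to show how a Koopman linear-dynamics term can constrain a nonlinear latent predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: observed dim D, latent dim d, sequence length T.
D, d, T = 64, 8, 100
E = rng.normal(size=(d, D)) * 0.1   # stand-in linear "encoder" (the paper uses an AE)
K = rng.normal(size=(d, d)) * 0.1   # data-driven Koopman operator (learned in practice)

x = rng.normal(size=(T, D))         # toy high-dimensional multivariate series
z = x @ E.T                         # latent trajectory z_t


def koopman_regularizer(z_pred, z_curr, K):
    """Penalize deviation of the nonlinear one-step prediction z_pred
    from the linear evolution K z_t of the current latent state."""
    lin_next = z_curr @ K.T
    return np.mean((z_pred - lin_next) ** 2)


# Stand-in for the GRU's nonlinear one-step latent prediction (assumption:
# here simulated as the true next latent state plus small noise).
z_pred = z[1:] + 0.01 * rng.normal(size=(T - 1, d))

pred_loss = np.mean((z_pred - z[1:]) ** 2)        # nonlinear prediction error
reg = koopman_regularizer(z_pred, z[:-1], K)      # linear-dynamics regularizer
total_loss = pred_loss + 0.5 * reg                # weight 0.5 is an assumption
```

In a trained model, minimizing the regularizer pushes the latent coordinates toward a subspace where the dynamics evolve (approximately) linearly under K, which is what lets the next latent state depend only on the current one.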