U-Net Inspired Transformer Architecture for Multivariate Time Series Synthesis

Abstract

This study introduces a Multiscale Dual-Attention U-Net (TS-MSDA U-Net) model for long-term time series synthesis. By integrating multiscale temporal feature extraction and dual-attention mechanisms into the U-Net backbone, the model captures complex temporal dependencies more effectively. The model was evaluated in two distinct applications. In the first, using multivariate datasets from 70 real-world electric vehicle (EV) trips, TS-MSDA U-Net achieved a mean absolute error below 1% across key parameters, including battery state of charge, voltage, acceleration, and torque—representing a two-fold improvement over the baseline TS-p2pGAN. While dual-attention modules provided only modest gains over the basic U-Net, the multiscale design enhanced overall performance. In the second application, the model was used to reconstruct high-resolution signals from low-speed analog-to-digital converter data in a prototype resonant CLLC half-bridge converter. TS-MSDA U-Net successfully learned nonlinear mappings and improved signal resolution by a factor of 36, outperforming the basic U-Net, which failed to recover essential waveform details. These results underscore the effectiveness of transformer-inspired U-Net architectures for high-fidelity multivariate time series modeling in both EV analytics and power electronics.
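The paper's exact layer configuration is not given in the abstract, but the two named ingredients — multiscale temporal feature extraction and dual-attention modules on a U-Net backbone — can be illustrated concretely. The sketch below is a minimal, hypothetical PyTorch rendering: the kernel sizes, channel widths, encoder depth, and the choice of squeeze-and-excitation-style channel attention plus multi-head temporal self-attention are all assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class MultiscaleConv1d(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes, fused by a
    1x1 convolution, to capture short- and long-range temporal patterns.
    Kernel sizes (3, 7, 15) are illustrative assumptions."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 7, 15)):
        super().__init__()
        branch_ch = out_ch // len(kernel_sizes)
        self.branches = nn.ModuleList(
            nn.Conv1d(in_ch, branch_ch, k, padding=k // 2)
            for k in kernel_sizes
        )
        self.fuse = nn.Conv1d(branch_ch * len(kernel_sizes), out_ch, 1)

    def forward(self, x):  # x: (B, C, T)
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class DualAttention(nn.Module):
    """One plausible 'dual attention': channel gating (squeeze-and-excitation
    style) followed by multi-head self-attention over the time axis."""
    def __init__(self, ch, num_heads=4):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(ch, ch // 4, 1), nn.ReLU(),
            nn.Conv1d(ch // 4, ch, 1), nn.Sigmoid(),
        )
        self.temporal_attn = nn.MultiheadAttention(ch, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(ch)

    def forward(self, x):                       # x: (B, C, T)
        x = x * self.channel_gate(x)            # reweight channels
        t = x.transpose(1, 2)                   # (B, T, C) for attention
        attn, _ = self.temporal_attn(t, t, t)   # temporal self-attention
        return self.norm(t + attn).transpose(1, 2)


class TSMSDAUNetSketch(nn.Module):
    """Two-level encoder/decoder with a skip connection; the real model's
    depth and attention placement may differ."""
    def __init__(self, in_ch, out_ch, base=32):
        super().__init__()
        self.enc1 = MultiscaleConv1d(in_ch, base)
        self.enc2 = MultiscaleConv1d(base, base * 2)
        self.pool = nn.MaxPool1d(2)
        self.bottleneck = DualAttention(base * 2)
        self.up = nn.ConvTranspose1d(base * 2, base, 2, stride=2)
        self.dec1 = MultiscaleConv1d(base * 2, base)
        self.head = nn.Conv1d(base, out_ch, 1)

    def forward(self, x):                       # x: (B, in_ch, T), T even
        e1 = self.enc1(x)                       # (B, base, T)
        e2 = self.enc2(self.pool(e1))           # (B, 2*base, T/2)
        b = self.bottleneck(e2)                 # attention at the bottleneck
        d1 = self.dec1(torch.cat([self.up(b), e1], dim=1))  # skip connection
        return self.head(d1)                    # (B, out_ch, T)


# Usage: e.g. four EV channels (state of charge, voltage, acceleration, torque)
model = TSMSDAUNetSketch(in_ch=4, out_ch=4)
y = model(torch.randn(8, 4, 256))               # -> torch.Size([8, 4, 256])
```

For the second application, the same input/output shape convention would apply, with the low-rate ADC signal upsampled to the target length before being fed to the network; how the reported 36x resolution gain is realized in the pipeline is not specified in the abstract.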
