TGCformer: A Transformer-Based Dual-Channel Fusion Framework for Power Load Anomaly Detection

Abstract

Existing methods for power load anomaly detection suffer from several limitations, including insufficient extraction of multi-scale temporal features, difficulty in capturing long-range dependencies, and inefficient fusion of heterogeneous Time-Graph information. To address these issues, this study proposes TGCformer, an enhanced framework for Time-Graph feature fusion. First, a dual-channel feature extraction module is constructed. The temporal path applies Time Series Feature Extraction based on Scalable Hypothesis Tests (TSFresh) to enhance the explicit pattern representation of the load sequences, while the graph-learning path employs a Sparse Unified Graph Attention Network v2 (Sparse Unified GATv2) to model global semantic correlations among time steps. Together, these two paths provide more interpretable and structured inputs for the subsequent fusion module. Second, a multi-head cross-attention mechanism is designed in which temporal features serve as the Query and graph-level embeddings as the Key and Value, guiding the fusion process so that complementary information is integrated effectively while noise is suppressed. Experiments on the public Irish CER Smart Meter Dataset demonstrate the effectiveness of the proposed model: TGCformer consistently outperforms four classic deep learning baselines (XceptionTime, InceptionTime, FormerTime, and LSTM-GNN) in detection accuracy and robustness.
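The fusion step described above — temporal features as Query, graph embeddings as Key and Value — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the sequence length, embedding dimension, head count, and the use of identity (rather than learned) projections are all illustrative assumptions.

```python
# Hedged sketch of multi-head cross-attention for Time-Graph fusion.
# Assumptions (not from the paper): T=16 time steps, d=32 dims, 4 heads,
# identity projections in place of learned W_q, W_k, W_v matrices.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_cross_attention(temporal, graph, num_heads=4):
    """temporal: (T, d) Query source; graph: (T, d) Key/Value source."""
    T, d = temporal.shape
    assert d % num_heads == 0, "embedding dim must divide evenly into heads"
    dh = d // num_heads
    # Split each stream into heads: (H, T, dh).
    q = temporal.reshape(T, num_heads, dh).transpose(1, 0, 2)
    k = graph.reshape(T, num_heads, dh).transpose(1, 0, 2)
    v = graph.reshape(T, num_heads, dh).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (H, T, T).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)
    attn = softmax(scores, axis=-1)
    # Weighted sum of graph values, then merge heads back to (T, d).
    out = attn @ v
    return out.transpose(1, 0, 2).reshape(T, d)

rng = np.random.default_rng(0)
temporal_feats = rng.normal(size=(16, 32))   # e.g. TSFresh-enhanced path
graph_embeds = rng.normal(size=(16, 32))     # e.g. Sparse Unified GATv2 path
fused = multi_head_cross_attention(temporal_feats, graph_embeds)
print(fused.shape)  # (16, 32)
```

In a trained model each head would carry its own learned projection matrices, and the fused output would feed a downstream anomaly-scoring head; here the sketch only shows how the Query/Key/Value roles are assigned across the two channels.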
