Deep Temporal Graph Learning for Cascade Popularity Prediction in Social Networks

Abstract

Information propagation in social networks often exhibits cascade behavior, where initial posts trigger chain reactions of shares and interactions. Accurately predicting cascade popularity is crucial for applications such as misinformation control and viral marketing. In this paper, we propose a hybrid architecture that combines Graph Neural Networks (GNNs) with Transformer encoders to capture both local network topology and long-range temporal dependencies in cascade evolution. Our model processes graph-structured cascade data through message-passing mechanisms (GCN/GAT) and models temporal dynamics with a self-attention Transformer layer. We evaluate our approach on two real-world datasets, Twitter and Digg, which provide rich temporal and structural information for cascade analysis. The prediction target is cascade popularity at multiple time horizons (6, 12, 18, and 24 hours), assessed using Spearman rank correlation and regression metrics (MSE, MAE, R²). Compared to established baselines, including GCN, GAT, and traditional node-embedding methods (DeepWalk and Node2Vec with an MLP), our GNN + Transformer model consistently achieves higher correlation and lower prediction error across all evaluation metrics. The improvements are particularly pronounced for long-term forecasts; for instance, our model substantially reduces MSE at the 24-hour horizon relative to all baselines. These results demonstrate that the integrated Transformer component effectively captures long-range cascade dynamics that conventional graph-based approaches do not adequately model.
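To make the hybrid design concrete, the following is a minimal NumPy sketch of the two components the abstract names: one GCN-style message-passing step over a toy cascade graph, followed by single-head scaled dot-product self-attention across the node sequence. All function names, the toy adjacency matrix, and the pooled readout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN message-passing step: symmetric normalization of A
    (with self-loops), linear transform W, then ReLU."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

def self_attention(X):
    """Single-head scaled dot-product self-attention over the
    activation-time-ordered node embeddings (simplified: Q = K = V = X)."""
    scores = X @ X.T / np.sqrt(X.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X

# Toy cascade: 4 nodes in a chain (root -> retweet -> ...), 8-dim features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, 8))                        # initial node features
W = rng.normal(size=(8, 8))                        # learnable GCN weight

Z = gcn_layer(A, H, W)      # structural embeddings from message passing
T = self_attention(Z)       # temporal mixing across the cascade sequence
popularity = T.mean()       # scalar popularity score via mean-pooled readout
print(Z.shape, T.shape)     # → (4, 8) (4, 8)
```

In the full model, the self-attention step would be a stacked Transformer encoder with learned projections and positional (time) encodings, and the pooled readout would feed a regression head trained against popularity at each horizon; this sketch only shows the data flow.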
