Self-Supervised Heterogeneous Graph Neural Network with Multi-Scale Meta-Path Contrastive Learning
Abstract
Heterogeneous graph neural networks (HGNNs) are remarkably effective at modeling complex structures and multi-semantic information. However, existing methods mainly focus on capturing high-order association patterns between heterogeneous nodes when constructing meta-paths, and they often lack sufficient expressive power for local neighborhood information. This limitation hinders their ability to model both global and local structural relationships. To address this issue, we propose HMMC, a self-supervised heterogeneous graph neural network based on multi-scale meta-path contrastive learning. The proposed approach introduces a multi-scale meta-path embedding mechanism that jointly captures local and global structural information. In addition, we design a cross-view self-supervised contrastive learning framework that optimizes representations across multiple views, enhancing the model's capacity to represent heterogeneous graph topology. To mitigate the negative-sample noise that often disrupts optimization in conventional contrastive learning, we further propose a star-shaped contrastive loss, which enforces representational consistency among positive sample pairs through a multi-level optimization strategy over center nodes, positive samples, and negative samples. Experimental results show that the proposed method outperforms state-of-the-art approaches across multiple datasets, achieving performance improvements of 0.5–4.1%, validating its representational capacity, robustness, and generalizability in heterogeneous graph learning tasks.
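The abstract does not give the exact form of the star-shaped contrastive loss, but the described structure (a center node contrasted jointly against its positive views and sampled negatives) is reminiscent of an InfoNCE-style objective. The sketch below is a minimal, hypothetical illustration of such a loss for a single "star"; the function name, temperature parameter, and cosine-similarity choice are assumptions, not the paper's actual formulation.

```python
import numpy as np

def cosine_sim(center, others):
    # Cosine similarity between a center vector and each row of a matrix.
    c = center / np.linalg.norm(center)
    o = others / np.linalg.norm(others, axis=1, keepdims=True)
    return o @ c

def star_contrastive_loss(center, positives, negatives, tau=0.5):
    """Hypothetical InfoNCE-style loss for one star: pull the center's
    positive views toward it while pushing sampled negatives away.
    `tau` is a temperature hyperparameter (assumed, not from the paper)."""
    pos = np.exp(cosine_sim(center, positives) / tau)
    neg = np.exp(cosine_sim(center, negatives) / tau)
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))

# The loss is small when positives align with the center and negatives
# do not, and grows when that alignment is reversed.
center = np.array([1.0, 0.0])
aligned = np.array([[1.0, 0.1]])
opposed = np.array([[-1.0, 0.0]])
loss_good = star_contrastive_loss(center, aligned, opposed)
loss_bad = star_contrastive_loss(center, opposed, aligned)
```

Under this sketch, optimizing all stars jointly would realize the multi-level center/positive/negative strategy the abstract describes, though the paper's actual loss may weight or structure the terms differently.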