SuperHyperGraph Attention Networks
Abstract
Graph Attention Networks (GAT) employ self-attention to aggregate neighboring node features in graphs, effectively capturing structural dependencies. HyperGraph Attention Networks (HGAT) extend this mechanism to hypergraphs by alternating attention-based vertex-to-hyperedge and hyperedge-to-vertex updates, modeling higher-order relationships. In this work, we introduce the n-SuperHyperGraph Attention Network, which leverages SuperHyperGraphs—a hierarchical generalization of hypergraphs—to perform multi-tier attention among supervertices and superedges. Our investigation is purely theoretical; empirical validation via computational experiments is left for future study.
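To make the alternating update scheme concrete, the following is a minimal NumPy sketch of one HGAT-style round: a vertex-to-hyperedge attention pass followed by a hyperedge-to-vertex pass. All names (`hgat_layer`, the shared projection `W`, the attention vector `a`, and the mean-pooled hyperedge context) are illustrative assumptions, not definitions taken from this paper or any specific HGAT implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hgat_layer(X, hyperedges, W, a):
    """One hypergraph attention round: vertex -> hyperedge -> vertex.

    X          : (n, d) vertex feature matrix
    hyperedges : list of lists of vertex indices (one list per hyperedge)
    W          : (d, d) shared linear projection (illustrative)
    a          : (2*d,) attention scoring vector (illustrative)
    """
    H = X @ W                       # projected vertex features
    n, d = H.shape
    # Vertex-to-hyperedge pass: each hyperedge attends over its member
    # vertices, using the mean of member features as a simple context.
    E = np.zeros((len(hyperedges), d))
    for j, members in enumerate(hyperedges):
        ctx = H[members].mean(axis=0)
        scores = np.array([a @ np.concatenate([ctx, H[v]]) for v in members])
        alpha = softmax(scores)
        E[j] = alpha @ H[members]   # attention-weighted sum of members
    # Hyperedge-to-vertex pass: each vertex attends over the hyperedges
    # it belongs to, aggregating their features back into the vertex.
    out = np.zeros_like(H)
    for v in range(n):
        inc = [j for j, m in enumerate(hyperedges) if v in m]
        if not inc:                 # isolated vertex: keep projected feature
            out[v] = H[v]
            continue
        scores = np.array([a @ np.concatenate([H[v], E[j]]) for j in inc])
        alpha = softmax(scores)
        out[v] = alpha @ E[inc]
    return out
```

In the SuperHyperGraph setting described above, one would apply such a round at each tier of the hierarchy, with supervertices playing the role of vertices and superedges the role of hyperedges; this sketch covers only a single tier.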