GRNFormer: Accurate Gene Regulatory Network Inference Using Graph Transformer

Abstract

We introduce GRNFormer, a generalizable graph transformer framework for accurate gene regulatory network (GRN) inference from transcriptomics data. Designed to work across species, cell types, and platforms without requiring cell-type annotations or prior regulatory information, GRNFormer integrates a transformer-based Gene Transcoder with a variational graph autoencoder (GraViTAE) and pairwise attention to learn representations of GRNs. Leveraging TF-Walker, a transcription-factor-anchored subgraph sampling strategy, it effectively captures gene regulatory interactions from single-cell or bulk RNA-seq data. In blind evaluations on standard benchmark datasets, GRNFormer outperforms state-of-the-art traditional and deep learning methods, achieving average Area Under the Receiver Operating Characteristic Curve (AUROC) and Area Under the Precision-Recall Curve (AUPRC) values of 0.90-0.98 and average F1 scores of 0.87-0.98. It reliably recovers both known and novel regulatory networks, including pluripotency circuits in human embryonic stem cells (hESCs) and immune cell modules in peripheral blood mononuclear cells (PBMCs). Its architecture enables scalable, biologically interpretable GRN inference across diverse datasets, cell types, and species, establishing GRNFormer as a robust and transferable tool for network biology.
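To make the TF-anchored sampling idea behind TF-Walker concrete, the sketch below illustrates one plausible way to collect subgraphs around transcription factors using short random walks. It is a minimal illustration, not the authors' implementation: the graph construction, the `walk_length` and `walks_per_tf` parameters, and the `tf_anchored_subgraph` helper are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of transcription-factor-anchored subgraph sampling,
# in the spirit of the TF-Walker strategy described in the abstract.
import random
import networkx as nx


def tf_anchored_subgraph(grn, tf_nodes, walk_length=3, walks_per_tf=5, seed=0):
    """Collect genes reachable by short random walks starting at TFs,
    then return the induced subgraph of the GRN."""
    rng = random.Random(seed)
    sampled = set(tf_nodes)
    for tf in tf_nodes:
        for _ in range(walks_per_tf):
            node = tf
            for _ in range(walk_length):
                # Follow regulatory edges; fall back to upstream regulators
                # if the current gene has no targets.
                neighbors = list(grn.successors(node)) or list(grn.predecessors(node))
                if not neighbors:
                    break
                node = rng.choice(neighbors)
                sampled.add(node)
    return grn.subgraph(sampled).copy()


# Toy example: a small directed GRN with two transcription factors.
grn = nx.DiGraph([("TF1", "G1"), ("TF1", "G2"), ("G2", "G3"),
                  ("TF2", "G3"), ("G3", "G4"), ("G5", "G4")])
sub = tf_anchored_subgraph(grn, ["TF1", "TF2"])
print(sorted(sub.nodes()), sub.number_of_edges())
```

In practice, each sampled subgraph would be passed to the encoder (the Gene Transcoder and GraViTAE components described above) rather than printed; the toy graph here only demonstrates the sampling step.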
