An Immuno-Linguistic Transformer for Multi-Scale Modeling of T-Cell Spatiotemporal Dynamics
Abstract
Understanding the spatiotemporal dynamics of T-cell clones is a critical challenge in immunology and immunotherapy, with direct implications for cancer treatment and vaccine design. While Large Language Models (LLMs) have demonstrated immense power in decoding complex sequential data, their application to the "language of immunity" remains nascent. Existing computational models often struggle to capture the hierarchical, multi-scale nature of immune responses and fail to model biologically plausible system perturbations. To bridge this gap, we propose the Immuno-Linguistic Spatiotemporal Transformer (ILST), a self-supervised framework inspired by LLM architectures. Our framework introduces two key innovations: a Biologically-Informed Perturbation (BIP) module that simulates systemic events (e.g., infection or therapy) by respecting the functional importance of key T-cell clones, and a Hierarchical Tissue-Scale Fusion (HTF) module that uses attention to dynamically weigh and combine representations from cellular, tissue, and systemic levels. We validate our model on several public graph datasets, which serve as effective proxies for complex biological networks with varying degrees of heterogeneity. ILST achieves consistently strong results in predicting node states. Notably, it significantly enhances performance on heterogeneous (disassortative) graphs, demonstrating its potential for robustly modeling T-cell dynamics in complex microenvironments like tumors.
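The two modules described above can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the authors' implementation: `biologically_informed_perturbation` masks a fraction of clone states for self-supervised training while sparing the most functionally important clones (the BIP idea), and `hierarchical_fusion` uses softmax attention to weigh and combine cellular-, tissue-, and system-level embeddings (the HTF idea). All function names, the importance scores, and the 50%-importance cutoff are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def biologically_informed_perturbation(clone_states, importance, rate=0.3):
    """Sketch of a BIP-style masking step: zero out a fraction of clone
    state vectors, drawing only from the lower-importance half so that
    key clones are never perturbed (cutoff and rate are assumptions)."""
    n = len(clone_states)
    candidates = np.argsort(importance)[: n // 2]  # least-important half
    masked = rng.choice(candidates, size=max(1, int(rate * n)), replace=False)
    perturbed = clone_states.copy()
    perturbed[masked] = 0.0
    return perturbed, masked

def hierarchical_fusion(cell_emb, tissue_emb, system_emb, query):
    """Sketch of HTF-style fusion: attention scores from a query vector
    produce weights over the three scales, whose embeddings are then
    combined into a single fused representation."""
    scales = np.stack([cell_emb, tissue_emb, system_emb])  # (3, d)
    weights = softmax(scales @ query)                      # (3,)
    return weights @ scales, weights                       # fused (d,), weights

# Toy usage: 16 clones with d-dimensional state vectors.
d = 8
states = rng.normal(size=(16, d))
importance = rng.random(16)
perturbed, masked = biologically_informed_perturbation(states, importance)
fused, w = hierarchical_fusion(rng.normal(size=d), rng.normal(size=d),
                               rng.normal(size=d), rng.normal(size=d))
```

In this sketch the attention query is an arbitrary vector; in a transformer it would itself be learned, so the relative weighting of scales adapts to context rather than being fixed.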