Survival Prediction for Bladder Cancer Using Multimodal Data With Quantum Neural Networks and Transformer Architectures
Abstract
Background: To address the challenges of cross-modal information fusion in high-dimensional multimodal medical data for cancer prognosis, this study presents a hybrid model for cancer survival prediction that integrates quantum computing with classical deep learning, evaluated in a retrospective analysis of bladder cancer patients. Methods: We propose QTMPN (Quantum-Transformer Multimodal Prognostic Network), a novel framework integrating quantum neural networks (QNNs), Transformers, and graph neural networks (GNNs). For high-dimensional whole-slide pathological images (WSIs), a quantum feature extractor (QFE) is designed that uses parallel quantum encoding and a hybrid quantum network to capture long-range dependencies. Multimodal data, including clinical and image features, are fused via a Transformer-GNN Collaborative Fusion (TCF) module that employs attention-guided dynamic graphs. Results: Evaluated on the TCGA-BLCA dataset, QTMPN attained a survival prediction accuracy of 76.1%, outperforming baseline models such as PARADIGM and CMTA (up to 70.0%). This improvement suggests an enhanced capability to capture cross-modal prognostic features. Further ablation experiments validated the effectiveness of the hybrid QNN feature extractor (QFE) within QTMPN. Conclusions: QTMPN offers a promising quantum-classical framework for survival risk prediction in bladder cancer, effectively modeling complex multimodal interactions. The approach contributes to improving prognostic accuracy in oncology and supports precision medicine.
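To make the hybrid quantum-classical idea behind the QFE concrete, the sketch below shows one common way such a feature extractor can be assembled with PennyLane and PyTorch: a classical layer compresses patch-level WSI features to a handful of angles, a variational quantum circuit processes them, and a classical head re-expands the readouts. This is a minimal illustration under assumed settings (4 qubits, 2 variational layers, 512-dimensional input features, and the class name HybridQFE are all illustrative), not the architecture or hyperparameters reported in the paper.

```python
# Minimal hybrid quantum-classical feature extractor sketch (illustrative only).
# Assumes PennyLane and PyTorch are installed; sizes below are placeholders,
# not the QFE configuration used in QTMPN.
import pennylane as qml
import torch
import torch.nn as nn

n_qubits = 4   # qubits per encoding block (assumption)
n_layers = 2   # depth of the variational ansatz (assumption)

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Angle-encode a small classical feature vector, then apply a trainable
    # entangling ansatz and read out Pauli-Z expectation values.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

class HybridQFE(nn.Module):
    """Classical projection -> variational quantum circuit -> classical head."""
    def __init__(self, in_dim=512, out_dim=64):
        super().__init__()
        self.reduce = nn.Linear(in_dim, n_qubits)  # compress patch features
        weight_shapes = {"weights": (n_layers, n_qubits, 3)}
        self.qlayer = qml.qnn.TorchLayer(circuit, weight_shapes)
        self.head = nn.Linear(n_qubits, out_dim)   # expand quantum readouts

    def forward(self, x):
        x = torch.tanh(self.reduce(x))  # keep encoded angles in a bounded range
        x = self.qlayer(x)
        return self.head(x)

# Example: embed a batch of 8 patch-level feature vectors.
features = torch.randn(8, 512)
print(HybridQFE()(features).shape)  # torch.Size([8, 64])
```

In a full pipeline, embeddings of this kind would then be combined with clinical features in a fusion module such as the TCF described above; that fusion step is not shown here.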