Quantum-Enhanced Multimodal Prognostic Transformer for Skin Disease Progression Prediction and Visualization

Abstract

Accurate classification and staging of skin diseases such as monkeypox, chickenpox, and measles are critical for timely clinical intervention, particularly in resource-limited settings. We present a proof-of-concept Quantum-Enhanced Multimodal Prognostic Transformer (Q-MPT) that integrates dermoscopic images with patient metadata, including age and lesion location, to predict disease type and progression stage jointly. The architecture combines a Vision Transformer backbone with a metadata fusion pathway and a lightweight quantum layer designed to enhance feature representation. To approximate disease evolution, we employ a latent trajectory predictor based on long short-term memory modeling and a quantum-inspired generative module that simulates counterfactual lesion appearances under different progression scenarios. Explainability is achieved through attention rollouts, Integrated Gradients for metadata attribution, and latent space visualization using variational autoencoders. On a custom-labeled dataset with synthetically derived stage labels, Q-MPT achieves 89.4% accuracy for disease classification and 87.3% for stage prediction, outperforming conventional convolutional neural networks and Vision Transformer baselines. While these results highlight the potential of integrating quantum-inspired computation with multimodal learning for dermatology, limitations include reliance on simulated metadata and the absence of validation on publicly available benchmarks. The findings establish Q-MPT as an early-stage framework that bridges diagnostic and prognostic modeling, providing a foundation for future clinically validated, explainable AI systems in dermatology.
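The core multimodal idea described above — patch-level image features fused with an embedded patient-metadata vector before joint prediction — can be sketched in a few lines. This is a minimal illustrative NumPy sketch, not the paper's implementation: the actual Q-MPT uses a Vision Transformer backbone, a quantum layer, and learned weights, none of which are reproduced here; all function names, dimensions, and the random projections are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def patch_embed(image, patch=8, dim=16):
    """Split a square image into non-overlapping patches and linearly
    project each flattened patch to `dim` features (ViT-style tokenization).
    Projection weights are random here; a real model would learn them."""
    h, w = image.shape
    patches = image.reshape(h // patch, patch, w // patch, patch)
    patches = patches.transpose(0, 2, 1, 3).reshape(-1, patch * patch)
    W = rng.standard_normal((patch * patch, dim)) * 0.02
    return patches @ W  # shape: (num_patches, dim)

def fuse(image_tokens, metadata, dim=16):
    """Mean-pool the image tokens and concatenate an embedded metadata
    vector, yielding a single fused feature for joint prediction heads."""
    img_feat = image_tokens.mean(axis=0)          # pooled image feature
    W_meta = rng.standard_normal((metadata.shape[0], dim)) * 0.02
    return np.concatenate([img_feat, metadata @ W_meta])

# Toy inputs: a 32x32 "dermoscopic image" and a 3-dim metadata vector
# (e.g. normalized age plus a two-way lesion-location encoding).
image = rng.standard_normal((32, 32))
metadata = np.array([0.45, 1.0, 0.0])
tokens = patch_embed(image)    # (16, 16): 16 patches, 16 features each
fused = fuse(tokens, metadata)
print(fused.shape)             # (32,)
```

The fused vector would feed two heads (disease type and progression stage); the paper's quantum layer and LSTM trajectory predictor operate on representations of this kind but are beyond this sketch.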
