Exploit Spatially Resolved Transcriptomic Data to Infer Cellular Features from Pathology Imaging Data

Abstract

Digital pathology is a rapidly advancing field where deep learning methods can be employed to extract meaningful imaging features. However, the efficacy of training deep learning models is often hindered by the scarcity of annotated pathology images, particularly images with detailed annotations for small image patches or tiles. To overcome this challenge, we propose an innovative approach that leverages paired spatially resolved transcriptomic data to annotate pathology images. We demonstrate the feasibility of this approach and introduce a novel transfer-learning neural network model, STpath (Spatial Transcriptomics and pathology images), designed to predict cell type proportions or classify tumor microenvironments. Our findings reveal that the features from pre-trained deep learning models are associated with cell type identities in pathology image patches. Evaluating STpath using three distinct breast cancer datasets, we observe its promising performance despite the limited training data. STpath excels in samples with variable cell type proportions and high-resolution pathology images. As the influx of spatially resolved transcriptomic data continues, we anticipate ongoing updates to STpath, evolving it into an invaluable AI tool for assisting pathologists in various diagnostic tasks.
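To make the transfer-learning idea concrete, below is a minimal sketch of how features from a pre-trained image backbone could be used to predict per-patch cell type proportions derived from paired spatial transcriptomics. This assumes a PyTorch/torchvision setup; the backbone choice, head architecture, loss, and the cell type count are illustrative placeholders and not the authors' actual STpath implementation.

```python
# Hedged sketch: pre-trained image features -> per-patch cell type proportions.
# Architecture, loss, and constants are illustrative assumptions, not STpath itself.
import torch
import torch.nn as nn
from torchvision import models

N_CELL_TYPES = 9  # hypothetical number of cell types from ST deconvolution

# Pre-trained backbone with the classification layer removed; yields a
# 2048-dim feature vector per image patch after global average pooling.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
feature_extractor = nn.Sequential(*list(backbone.children())[:-1])
for p in feature_extractor.parameters():
    p.requires_grad = False  # freeze the backbone; train only the small head

# Small head mapping image features to a vector of cell type proportions.
head = nn.Sequential(
    nn.Flatten(),
    nn.Linear(2048, 256),
    nn.ReLU(),
    nn.Linear(256, N_CELL_TYPES),
    nn.Softmax(dim=1),  # proportions sum to 1 for each patch
)

# Dummy batch: 4 RGB 224x224 patches standing in for H&E tiles, paired with
# deconvolution-derived proportions from the matched spatial transcriptomics.
patches = torch.rand(4, 3, 224, 224)
target_props = torch.softmax(torch.rand(4, N_CELL_TYPES), dim=1)

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
criterion = nn.MSELoss()  # illustrative choice; the paper's objective may differ

feature_extractor.eval()
with torch.no_grad():
    feats = feature_extractor(patches)
pred_props = head(feats)
loss = criterion(pred_props, target_props)
loss.backward()
optimizer.step()
print("predicted proportions for patch 0:", pred_props[0].detach().numpy().round(3))
```

The same feature-plus-head pattern could be repurposed for tumor microenvironment classification by swapping the proportion head for a classification head with a cross-entropy loss; again, this is an assumption about the general approach rather than a description of the published model.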
