Automated Tumor and FUS Lesion Quantification on Multi-frequency Harmonic Motion and B-mode Imaging Using a Multi-modality Neural Network

Abstract

Harmonic Motion Imaging (HMI) is an ultrasound elasticity imaging method that measures the mechanical properties of tissue using amplitude-modulated acoustic radiation force (AM-ARF). By estimating the tissue's on-axis oscillatory motion, HMI produces displacement images that represent localized relative stiffness, can predict tumor response to neoadjuvant chemotherapy (NACT), and can monitor focused ultrasound (FUS) ablation therapy. Multi-frequency HMI (MF-HMI) excites tissue at several AM frequencies simultaneously, which allows image optimization without prior knowledge of inclusion size and stiffness. However, challenges remain in size estimation, as inconsistent boundary effects produce different perceived inclusion sizes across AM frequencies. Herein, we developed an automated tumor and FUS lesion quantification method using a transformer-based multi-modality neural network, HMINet. The network was trained on 380 pairs of MF-HMI and B-mode images of phantoms and of in vivo orthotopic breast cancer mice (4T1). Test datasets included phantoms (n = 32), in vivo 4T1 mice (n = 24), breast cancer patients (n = 16), and a FUS-induced lesion, with average segmentation accuracies (Dice similarity scores) of 0.95, 0.86, 0.82, and 0.87, respectively. To increase the generalizability of HMINet, we applied a transfer learning strategy, i.e., fine-tuning the model on patient data. For NACT patients, displacement ratios (DRs) between the tumor and surrounding tissue were computed from HMINet-segmented boundaries to predict tumor response through stiffness changes.
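The simultaneous multi-frequency excitation can be illustrated with a short signal sketch. This is a minimal illustration, not the study's actual pulse sequence: the AM frequencies, FUS carrier frequency, sampling rate, and duration below are placeholder assumptions.

```python
import numpy as np

fs = 50e6                          # sampling rate (Hz); assumed
duration = 0.01                    # excitation duration (s); assumed
am_freqs = [200, 400, 600, 800]    # AM frequencies (Hz); illustrative only
carrier = 4.5e6                    # FUS carrier frequency (Hz); assumed

t = np.arange(0, duration, 1 / fs)

# Sum of raised-cosine AM envelopes, one per frequency; modulating the FUS
# beam this way drives tissue oscillation at several frequencies at once.
envelope = sum(0.5 * (1 + np.cos(2 * np.pi * f * t)) for f in am_freqs)
envelope /= envelope.max()         # normalize to [0, 1]

excitation = envelope * np.sin(2 * np.pi * carrier * t)
```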
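Segmentation accuracy is reported as the Dice similarity score. For reference, a minimal NumPy implementation of the metric over binary masks (the function name and mask convention are ours, not the paper's):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-8) -> float:
    """Dice similarity between two binary masks (True/1 = segmented region)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Identical masks score 1.0; disjoint masks score 0.0.
mask = np.zeros((128, 128), dtype=bool)
mask[40:80, 40:80] = True
print(dice_score(mask, mask))  # -> 1.0
```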
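The transfer-learning step is described only as fine-tuning on patient data. A minimal PyTorch sketch of that recipe follows; the tiny convolutional stand-in model, synthetic tensors, and hyperparameters are all placeholder assumptions, since the abstract does not specify the HMINet architecture or training details.

```python
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for HMINet: the abstract says only that it is a transformer-based
# multi-modality network, so any segmentation nn.Module fits this sketch.
model = nn.Sequential(nn.Conv2d(2, 8, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(8, 1, 1))   # 2 input channels: HMI + B-mode

# Synthetic tensors standing in for the patient fine-tuning set.
images = torch.randn(16, 2, 64, 64)               # stacked HMI/B-mode inputs
masks = (torch.rand(16, 1, 64, 64) > 0.5).float() # binary tumor masks
loader = DataLoader(TensorDataset(images, masks), batch_size=4)

# Typical fine-tuning recipe: start from pretrained weights (loading omitted
# here) and continue training at a reduced learning rate on the new domain.
optimizer = optim.AdamW(model.parameters(), lr=1e-5)
loss_fn = nn.BCEWithLogitsLoss()

model.train()
for epoch in range(3):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```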
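The abstract does not define how the displacement ratio is computed from the segmented boundary. One plausible reading, sketched below under that assumption, is the mean HMI displacement inside the tumor mask divided by the mean displacement in a thin ring of surrounding tissue; the ring construction and its width are our choices, not the paper's.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def displacement_ratio(displacement: np.ndarray, tumor_mask: np.ndarray,
                       ring_width: int = 10) -> float:
    """Mean |displacement| inside the tumor divided by the mean |displacement|
    in a ring of surrounding tissue just outside the segmented boundary."""
    tumor_mask = tumor_mask.astype(bool)
    # Surrounding-tissue ROI: dilate the tumor mask, then drop the tumor itself.
    ring = binary_dilation(tumor_mask, iterations=ring_width) & ~tumor_mask
    return float(np.abs(displacement[tumor_mask]).mean()
                 / np.abs(displacement[ring]).mean())
```

Because stiffer tissue displaces less under the same radiation force, tracking this ratio over the course of NACT reflects relative stiffness changes in the tumor, consistent with the abstract's stated use of DRs.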
