Deep Learning Technology for Classification of Thyroid Nodules Using Multi-view Ultrasound Images: Potential Benefits and Challenges in Clinical Application

Abstract

Background: This study aimed to evaluate the applicability of deep learning technology to thyroid ultrasound images for the classification of thyroid nodules.

Methods: This retrospective analysis included ultrasound images of patients with thyroid nodules investigated by fine-needle aspiration (FNA) at the thyroid clinic of a single center from April 2010 to September 2012. Thyroid nodules with cytopathologic results of Bethesda category V (suspicious for malignancy) or VI (malignant) were defined as thyroid cancer. Multiple deep learning algorithms based on convolutional neural networks (CNNs), namely ResNet, DenseNet, and EfficientNet, were used, and Siamese neural networks enabled multi-view analysis of paired transverse and longitudinal ultrasound images.

Results: Among 1,048 analyzed thyroid nodules from 943 patients, 306 (29%) were identified as thyroid cancer. In a subgroup analysis of transverse and longitudinal images, longitudinal images showed superior predictive ability. Multi-view modeling based on paired transverse and longitudinal images significantly improved model performance, with an accuracy of 0.82 (95% confidence interval [CI], 0.80-0.86) with ResNet50, 0.83 (95% CI, 0.83-0.88) with DenseNet201, and 0.81 (95% CI, 0.79-0.84) with EfficientNetv2_s. Training with high-resolution images obtained using the latest equipment tended to improve model performance, in association with increased sensitivity.

Conclusion: CNN algorithms applied to ultrasound images demonstrated substantial accuracy in thyroid nodule classification, indicating their potential as valuable tools for diagnosing thyroid cancer. However, in real-world clinical settings, it is important to be aware that model performance may vary depending on the quality of images acquired by different physicians and imaging devices.
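The multi-view approach described in the Methods can be pictured as a Siamese-style network: one shared CNN backbone encodes both the transverse and the longitudinal view of a nodule, and the two embeddings are fused for classification. The following is a minimal PyTorch sketch under that assumption; the class name `MultiViewSiamese`, the concatenation-based fusion head, and the choice of a ResNet50 backbone (one of the architectures named in the study) are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

class MultiViewSiamese(nn.Module):
    """Siamese-style multi-view classifier: a single shared CNN backbone
    encodes the transverse and longitudinal views; the two embeddings are
    concatenated and passed to a small classification head.
    (Illustrative sketch, not the authors' exact architecture.)"""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        backbone = models.resnet50(weights=None)  # one of the CNNs named in the study
        feat_dim = backbone.fc.in_features        # 2048 for ResNet50
        backbone.fc = nn.Identity()               # strip the ImageNet classifier head
        self.encoder = backbone                   # shared weights across both views
        self.head = nn.Sequential(
            nn.Linear(feat_dim * 2, 256),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, transverse: torch.Tensor,
                longitudinal: torch.Tensor) -> torch.Tensor:
        # The same encoder processes both views (weight sharing is what
        # makes the network "Siamese").
        z_t = self.encoder(transverse)
        z_l = self.encoder(longitudinal)
        return self.head(torch.cat([z_t, z_l], dim=1))

# Usage: a paired batch of transverse/longitudinal ultrasound crops.
model = MultiViewSiamese()
t = torch.randn(4, 3, 224, 224)  # transverse views
l = torch.randn(4, 3, 224, 224)  # longitudinal views
logits = model(t, l)             # shape (4, 2): benign vs. malignant
```

Sharing encoder weights across the two views keeps the parameter count equal to a single-view model while letting the fusion head exploit complementary information from both planes, which is consistent with the reported performance gain of multi-view modeling over single-view baselines.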
