Distilling clinical sensitivity: Explainable and lightweight knee osteoarthritis diagnosis via decoupled knowledge distillation

Abstract

Knee Osteoarthritis (KOA) is a leading cause of global disability, necessitating early and accurate diagnosis to prevent irreversible joint degeneration. While deep learning has automated radiographic assessment, current state-of-the-art models incur massive computational costs, hindering deployment in resource-constrained clinical settings. To bridge this gap, we propose a high-efficiency framework utilizing Decoupled Knowledge Distillation (DKD). We employ a Swin Transformer teacher to guide a lightweight GhostNetV2 student, decoupling the distillation loss into Target Class (TCKD) and Non-Target Class (NCKD) components. The method achieves a binary classification accuracy of 87.02% (95% CI: 0.870 ± 0.016) on the Osteoarthritis Initiative (OAI) dataset. While overall accuracy is statistically comparable to a non-distilled baseline (P > 0.05), the DKD framework yields a statistically significant improvement in diagnostic sensitivity (0.825 vs. 0.767; non-overlapping 95% CIs), critically reducing false negatives. The model matches the heavy teacher's performance while being approximately 6 times smaller and 5 times faster. Interpretability analysis with LayerCAM shows that the model precisely localizes osteophytes in a manner consistent with radiological interpretation, presenting a scalable, clinically safe solution for automated KOA screening.
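The decoupling referred to in the abstract follows the general DKD formulation, in which the classical knowledge distillation loss is split into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently. The sketch below is a minimal PyTorch rendering of that decomposition; the function name and the hyperparameter values (alpha, beta, temperature T) are illustrative assumptions and are not taken from the preprint. Note that in the purely binary setting reported above, the non-target "distribution" collapses to a single class, so the NCKD term is shown here in its general multi-class form.

import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=2.0, T=4.0):
    """Decoupled KD loss sketch: alpha * TCKD + beta * NCKD (hyperparameters are assumed)."""
    num_classes = student_logits.size(1)
    gt_mask = F.one_hot(target, num_classes).float()  # 1 at the target class
    other_mask = 1.0 - gt_mask                        # 1 at every non-target class

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # TCKD: KL divergence between binary (target vs. all others) probabilities
    b_s = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    b_t = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tckd = F.kl_div(torch.log(b_s), b_t, reduction="batchmean") * (T ** 2)

    # NCKD: KL divergence over non-target classes only (target logit masked out)
    log_ps_nt = F.log_softmax(student_logits / T - 1000.0 * gt_mask, dim=1)
    pt_nt = F.softmax(teacher_logits / T - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_ps_nt, pt_nt, reduction="batchmean") * (T ** 2)

    return alpha * tckd + beta * nckd

In practice this distillation term would be added to the student's ordinary cross-entropy loss; the relative weights alpha and beta control how strongly the student matches the teacher's target-class confidence versus its ranking of the remaining classes.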
