SHAP-Driven Ensemble Learning for Explainable COVID-19 Detection

Abstract

The COVID-19 pandemic had a devastating impact on the global community and challenged healthcare systems worldwide. Disease progression can also severely damage the heart, kidneys, and liver. Accurate and explainable detection is therefore essential for effective clinical response. In this study, we propose a SHAP-driven ensemble learning framework for COVID-19 detection, combining an optimized XGBoost classifier with Random Forest, that provides interpretable insights into model predictions. We benchmark several machine-learning models on the same dataset to compare their performance with the proposed framework. The study integrates Shapley additive explanations (SHAP) to improve feature selection, model interpretability, and prediction reliability. The proposed framework identifies key biomarkers that strongly influence COVID-19 predictions, thereby improving transparency and trust in AI applications. Experimental results show that the proposed ensemble model outperformed all other models, achieving an accuracy of 96.7%. The integration of SHAP not only enhances the model’s performance but also helps identify the critical factors driving COVID-19 detection.
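The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: scikit-learn's `GradientBoostingClassifier` stands in for the optimized XGBoost model, a synthetic dataset stands in for the clinical biomarker data, and the Shapley values are computed exactly by enumerating feature coalitions (feasible here because the toy dataset has only six features; the `shap` library's tree explainers would be used at realistic scale).

```python
import math
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a clinical biomarker dataset (assumption).
X, y = make_classification(n_samples=600, n_features=6, n_informative=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# Soft-voting ensemble of a boosted-tree model and a Random Forest;
# GradientBoostingClassifier stands in for the paper's optimized XGBoost.
ensemble = VotingClassifier(
    estimators=[
        ("gb", GradientBoostingClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
acc = ensemble.score(X_test, y_test)

def exact_shapley(model, x, background, n_features):
    """Exact Shapley attribution of the positive-class probability for one
    sample: features absent from a coalition are replaced by their
    background (training-set) mean."""
    base = background.mean(axis=0)
    phi = np.zeros(n_features)
    feats = list(range(n_features))
    for i in feats:
        others = [f for f in feats if f != i]
        for k in range(len(others) + 1):
            # Standard Shapley coalition weight: |S|! (n-|S|-1)! / n!
            w = (math.factorial(k) * math.factorial(n_features - k - 1)
                 / math.factorial(n_features))
            for S in combinations(others, k):
                z_with, z_without = base.copy(), base.copy()
                for f in S:
                    z_with[f] = x[f]
                    z_without[f] = x[f]
                z_with[i] = x[i]
                p_with = model.predict_proba(z_with.reshape(1, -1))[0, 1]
                p_without = model.predict_proba(z_without.reshape(1, -1))[0, 1]
                phi[i] += w * (p_with - p_without)
    return phi

# Per-feature contributions for one test sample; ranking |phi| identifies
# the most influential biomarkers for that prediction.
phi = exact_shapley(ensemble, X_test[0], X_train, X.shape[1])
```

By the efficiency property of Shapley values, `phi.sum()` equals the model's predicted probability for the sample minus its prediction at the background mean, so the attributions account exactly for the ensemble's output.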