Closing the Lung Cancer Screening Gap in FQHCs with AI-Powered Clinical Decision Support
Abstract
Background
Lung cancer remains the leading cause of cancer-related mortality in the United States, with screening adherence rates below 16% nationally and even lower among underserved populations [1–4]. Federally Qualified Health Centers (FQHCs), which serve over 30 million patients annually, face significant challenges in identifying screening-eligible individuals due to workforce constraints, manual workflows, and limited clinical decision support (CDS) infrastructure [5–7].
Objective
We developed and evaluated an AI-driven, natural language processing (NLP)-enhanced CDS platform designed to automate lung cancer screening eligibility determination and integrate seamlessly into FQHC workflows.
Methods
The platform combines a deterministic rules engine encoding the U.S. Preventive Services Task Force (USPSTF) 2021 screening criteria with an NLP pipeline built on SciSpacy and MedSpaCy that extracts smoking history details from unstructured clinical notes. Structured and unstructured data were unified through FHIR-compliant APIs to generate actionable screening flags within simulated FQHC workflows.
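The USPSTF 2021 criteria themselves are public: adults aged 50 to 80 with at least a 20 pack-year smoking history who currently smoke or quit within the past 15 years. The minimal sketch below illustrates how such a deterministic rules check might be paired with free-text extraction; the regex is a simplified stand-in for the SciSpacy/MedSpaCy pipeline, and all names (SmokingHistory, extract_pack_years, uspstf_2021_eligible) are illustrative assumptions, not the platform's actual implementation.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class SmokingHistory:
    pack_years: Optional[float] = None
    years_since_quit: Optional[float] = None  # None means current smoker

def extract_pack_years(note_text: str) -> Optional[float]:
    """Toy regex stand-in for the NLP pipeline: pulls a pack-year
    figure from free text such as '30 pack-year smoking history'."""
    match = re.search(r"(\d+(?:\.\d+)?)\s*pack[- ]?years?", note_text, re.IGNORECASE)
    return float(match.group(1)) if match else None

def uspstf_2021_eligible(age: int, history: SmokingHistory) -> bool:
    """Deterministic USPSTF 2021 lung cancer screening criteria:
    age 50-80, >=20 pack-years, current smoker or quit within 15 years."""
    if history.pack_years is None:
        return False
    within_quit_window = (history.years_since_quit is None
                          or history.years_since_quit <= 15)
    return 50 <= age <= 80 and history.pack_years >= 20 and within_quit_window

# Example: a 63-year-old whose note documents a 30 pack-year history, quit 5 years ago
note = "63 y/o with a 30 pack-year smoking history, quit 5 years ago."
history = SmokingHistory(pack_years=extract_pack_years(note), years_since_quit=5)
print(uspstf_2021_eligible(age=63, history=history))  # True
```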
Results
In a synthetic cohort modeled on FQHC populations, the platform achieved precision, recall, and F1-scores above 0.90 for eligibility determination and reduced manual chart review workload by 62%. It generated clear, auditable recommendations aligned with Centers for Medicare & Medicaid Services (CMS) coverage policies and Healthcare Effectiveness Data and Information Set (HEDIS) quality measures, while identifying additional high-risk patients who may have been missed in manual reviews.
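For readers unfamiliar with the reported metrics, the short sketch below shows how precision, recall, and F1 could be computed for binary eligibility flags against a manually reviewed reference standard. The labels are made up for illustration and are not the study's data or evaluation code.

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical example: reference labels from manual chart review versus the
# platform's automated flags (1 = screening-eligible, 0 = not eligible).
reference = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
predicted = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

print(f"precision: {precision_score(reference, predicted):.2f}")
print(f"recall:    {recall_score(reference, predicted):.2f}")
print(f"f1:        {f1_score(reference, predicted):.2f}")
```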
Conclusion
This work demonstrates the feasibility of an AI-powered CDS framework for improving lung cancer screening adherence in resource-limited primary care settings. Future development will focus on prospective validation, expansion to other preventive screenings and population health use cases, and integration of explainable AI to enhance clinical trust and scalability.