Patient and staff acceptability of an artificial intelligence software to prioritize chest X-rays in an NHS setting: A qualitative evaluation


Abstract

Background: Plain-film chest X-rays (CXRs) remain the first-line investigation in the lung cancer (LC) diagnostic pathway; however, the time from the original referral for CXR to computed tomography (CT) and respiratory review can currently take several weeks for outpatients. An artificial intelligence (AI) software, which can be used to speed up the detection of LC by flagging abnormalities in CXRs for priority reporting, has the potential to enable outpatient CT imaging within 72 hours of CXR and reduce the time from referral to reported CT by several weeks. A longitudinal qualitative evaluation was designed to provide an in-depth understanding of the acceptability and behavioral impact of the software from the perspectives of both NHS staff and service users.

Methods: Repeat, semistructured interviews were conducted with 16 NHS staff (radiologists, radiographers, and administrators) before and after implementation of the software in a Scottish NHS board. Eight NHS service users participated in focus groups to explore their experiences and attitudes toward the software. The Theoretical Framework of Acceptability informed data collection; however, the data analysis employed an inductive thematic approach.

Results: Staff viewed the software as a useful adjunct and believed it could expedite LC pathways; service users perceived the potential for better care through earlier diagnosis and an accelerated pathway to treatment. The radiology staff raised concerns about the software's accuracy and interpretations, as well as the potential for complacency and biased decision-making. Staff perceived that the change in the patient pathway performed better than anticipated, and service users voiced a preference for short-term anxiety with prompt action (AI pathway) over prolonged uncertainty (standard pathway).

Conclusions: This study identified key facilitators of and barriers to the acceptability of AI software for prioritizing CXRs. There is support for AI-assisted CXR assessment, provided that it improves patient outcomes and is used in conjunction with clinicians. A hierarchy of confidence in the software's performance significantly affected the experienced acceptability of the technology. To support stable, long-term adoption, AI software must integrate easily into existing systems, perform accurately, and not disrupt practice or negatively impact patient pathways.