A rapid scoping review of evaluability assessment frameworks for health and care service delivery innovations


Abstract

Background
With resources scarce, health and care decision-makers must allocate evaluation resources efficiently to innovations that are likely to be ready for evaluation, so that innovations that actually achieve impact are implemented and scaled up. Guidance on how an evaluation should be planned before it proceeds is poorly understood. Our aim was to scope the published literature on evaluability assessment frameworks that address whether and how an innovation evaluation should be planned.

Methods
For this rapid scoping review (registration number: https://osf.io/8x2n4), we searched Ovid MEDLINE, Ovid Embase and additional resources in October 2024 to identify English-language publications reporting evaluability assessment frameworks that support the planning of evaluations of innovations. We screened studies and extracted and coded data on the identified frameworks and 20 pre-specified components, following rapid approaches to scoping review methodology. We synthesised and presented the evidence narratively and in tabular format as required.

Results
We included 36 publications reporting 24 unique evaluability assessment frameworks. Only 40% (8/20) of frameworks were developed in the health and care fields. Only four frameworks (16.7%) were reported following well-acknowledged development approaches such as Moher's guidance development methodology. None of the 24 frameworks produced, following the assessment exercise, an evaluation plan that users could carry forward into an evaluation in real-world settings. No framework was comprehensive with respect to the 20 pre-specified components, with notable omissions around health equity (not reported in any framework) and research governance (reported in 8.3%, 2/24 frameworks).

Conclusions
Despite the availability of several frameworks, none is comprehensive in covering the important components, and none instructs users to produce an actual evaluation plan following the assessment exercise. It may be necessary to develop such a tool to facilitate the planning of an innovation evaluation before a formal summative evaluation.

Trial registration
Open Science Framework registration number: https://osf.io/8x2n4
