Implausible Effects of Psychological Interventions: Meta-Epidemiological Study and Development of a Simple Flagging Tool
Abstract
In meta-analyses of psychological interventions, trials occasionally report effects that appear implausibly large. While such results are unlikely to reflect genuine treatment effects, this is rarely verifiable, and the studies are often retained in the meta-analytic evidence. Existing guidance allows highly questionable results to be discarded in evidence syntheses but offers little direction on how to identify them in practice. Consequently, it remains unclear to what extent suspicious evidence has biased effect estimates in psychological intervention research. In this study, we develop a simple flagging tool to detect such trials, based on (1) the compatibility of their effect size with low-risk-of-bias evidence, (2) their achieved power, and (3) their methodological rigor. We also examine specific characteristics of studies flagged by this tool, and the impact of their exclusion on pooled estimates and heterogeneity. In total, 2,881 effect sizes from 1,246 randomized trials were included, drawn from twelve living databases of psychological interventions for mental health problems. Overall, 5.3% of all effects (n = 153 across 102 studies) were flagged. Reanalysis of 135 meta-analyses from a large-scale evaluation of psychological interventions showed that excluding flagged studies led to substantially lower effect estimates (reductions of up to 31.2%) and decreased between-study heterogeneity (reductions of up to 51.1% in indication-wide analyses). The flagging tool has been integrated into the open-source R package “metapsyTools”. We discuss potential explanations for the accumulation of improbable findings in the published literature, and how the application of our tool may strengthen quality control in meta-analytic research.
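As a rough illustration of how the three criteria could be combined into a flagging rule, the R sketch below implements one possible operationalization. It is not the authors' implementation and does not reproduce the metapsyTools tool: the thresholds, the column names (g, se_g, n_arm1, n_arm2, rob_high_domains), the benchmark_g argument, and the way the criteria are conjoined are all assumptions made for illustration only.

```r
# Illustrative sketch only (assumed columns and thresholds, not the published tool):
# flag an effect if it is (1) incompatible with a low-risk-of-bias benchmark,
# (2) reported by an underpowered trial, and (3) comes from a methodologically weak trial.
flag_implausible <- function(data,
                             benchmark_g,          # assumed: pooled g of low-risk-of-bias trials
                             power_target = 0.80,  # assumed power threshold
                             rob_cutoff   = 2) {   # assumed: max. high-risk RoB domains tolerated

  # (1) Compatibility: how far does the observed effect lie above the benchmark,
  #     in units of its own standard error?
  z_incompat   <- (data$g - benchmark_g) / data$se_g
  incompatible <- z_incompat > qnorm(0.975)

  # (2) Achieved power: power to detect the benchmark effect given the trial's
  #     arm sizes (using the smaller arm as a conservative approximation).
  achieved_power <- mapply(function(n1, n2) {
    stats::power.t.test(n = min(n1, n2), delta = benchmark_g,
                        sig.level = 0.05, type = "two.sample")$power
  }, data$n_arm1, data$n_arm2)
  underpowered <- achieved_power < power_target

  # (3) Methodological rigor: e.g., number of risk-of-bias domains rated as high risk.
  low_rigor <- data$rob_high_domains >= rob_cutoff

  # Flag effects meeting all three criteria.
  data$flagged <- incompatible & underpowered & low_rigor
  data
}
```

In practice, a flagged data set like this could then be reanalyzed with and without the flagged effects to gauge their impact on pooled estimates and heterogeneity, which is the comparison reported in this study.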
Key Points
Question
How can implausibly large effect sizes in psychological intervention trials be identified, and what impact do they have on meta-analytic evidence?
Findings
In this meta-epidemiological study, we developed a simple flagging tool based on effect size compatibility, statistical power, and methodological rigor. Applying it to 2,881 effect sizes from 12 living databases, we found that 5.3% of effects were flagged. Excluding flagged studies led to substantially lower pooled estimates (reductions of up to 31.2%) and decreased between-study heterogeneity (up to 51.1%).
Meaning
Implausible effects can substantially bias meta-analytic evidence. A simple flagging tool can help identify such effects and improve quality control in evidence syntheses. The tool has been integrated into an open-source R package for routine use.