Methodological Analysis of Bias Risks in Adaptive Multi-Arm Platform Trials: A Case-Series from Three COVID-19 Studies


Abstract

Background: Adaptive platform trials (APTs) were widely promoted during the COVID-19 pandemic as an efficient and flexible approach for evaluating multiple interventions. However, their complexity and frequent protocol amendments may introduce risks of bias. We selected ivermectin as our case study both because of the marked discrepancy between the generally positive findings in the broader literature and the negative results from a few large trials, and because these influential studies were all designed as adaptive platform trials with multiple arms.

Methods: We performed a methodological case-series analysis of three major adaptive platform trials: ACTIV-6 (USA), PRINCIPLE (UK), and TOGETHER (Brazil), each of which included an ivermectin arm for outpatient COVID-19. For each trial, we systematically reviewed all publicly available protocols, registry entries, analysis plans, preprints, and publications. We identified key protocol changes, design limitations, and reporting issues, and summarized recurring patterns related to endpoints, dosing, treatment timing, randomization, reporting, and data transparency. Each issue was assessed for its likely direction of bias on the observed efficacy of ivermectin.

Results: All three trials exhibited extensive protocol changes, outcome modifications, and reporting inconsistencies, many of which occurred after participant enrollment had begun. Common issues included delayed treatment initiation, suboptimal dosing or administration, selective reporting of results, and inconsistent application of early stopping rules. Positive findings in secondary outcomes or subgroups were often omitted or downplayed in final publications. Data inconsistencies and lack of transparency further limited independent verification. The cumulative effect of these issues was a systematic bias toward underestimating any potential benefit of ivermectin.

Conclusion: Our analysis shows that even large adaptive platform trials can be compromised by methodological flaws, losing their intended advantages and potentially distorting the evidence base. In all cases, these flaws systematically biased the assessment of ivermectin, meaning the results of these trials cannot be regarded as definitive evidence against its potential benefit. We propose that small but well-designed randomized trials, when combined in meta-analyses, may provide a more reliable foundation for evidence, even if individual studies are underpowered. Above all, critical appraisal and transparency are essential for trustworthy clinical research and sound policy decisions.