Analytical Flexibility and the Crisis of Research Integrity: An Institutional Audit of the British Columbia Centre on Substance Use (2008–2025)

Abstract

Background: The reproducibility crisis has emerged as a defining challenge in contemporary science, with mounting evidence of questionable research practices across disciplines. Substance use research, which directly informs life-and-death policy decisions for marginalized populations, remains particularly vulnerable to analytical flexibility that can compromise evidence quality. This audit evaluated the consistency, transparency, and methodological rigour of a subset of quantitative, regression-based studies conducted by the British Columbia Centre on Substance Use (BCCSU) between 2008 and 2025.

Methods: We systematically reviewed 85 peer-reviewed publications by BCCSU-affiliated researchers employing cross-sectional data from major cohorts (VIDUS, ACCESS, ARYS). Studies were assessed for pre-registration status, variable selection methods, transformation justifications, sampling timeframe rationale, missing data handling, and data/code transparency. Structured metadata extraction enabled quantification of researcher degrees of freedom across the institutional portfolio.

Results: Zero studies (0%) were pre-registered on public platforms. Fifty-one publications (58%) relied on automated stepwise selection methods (AIC/QIC minimization) rather than theory-driven covariate specification, which was used in only 3.4% of studies. Critical methodological decisions lacked transparency: 36% of studies employed unjustified variable transformations of drug use measures; 34% did not report how missing data were handled; 66% provided no justification for sampling start dates; and 69% provided no justification for sampling end dates. These researcher degrees of freedom produced pervasive analytic heterogeneity that rendered the evidence base largely unsynthesizable.

Conclusions: The findings reveal systemic patterns that reflect institutional incentive structures prioritizing publication quantity over methodological rigour. Widespread reliance on data-driven model selection creates a high-risk environment for p-hacking and other questionable research practices. Radical reform via preregistration mandates, standardized reporting templates, and annual integrity metrics is required to restore public trust and produce policy-actionable science capable of serving vulnerable populations.
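Why data-driven model selection is risky can be illustrated with a minimal simulation sketch (hypothetical code, not drawn from any BCCSU analysis): forward stepwise selection that minimizes AIC over a pool of pure-noise candidate covariates will typically retain some of them, because the procedure capitalizes on chance associations in the sample.

```python
import numpy as np

def aic(y, X):
    # Gaussian-likelihood AIC for an OLS fit: n*log(RSS/n) + 2*(k+1),
    # where k is the number of columns in X (+1 for the error variance).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, k = X.shape
    return n * np.log(rss / n) + 2 * (k + 1)

def forward_stepwise_aic(y, Z):
    # Greedy forward selection: starting from an intercept-only model,
    # repeatedly add the candidate column of Z that most lowers AIC,
    # stopping when no addition improves it.
    n, p = Z.shape
    selected, remaining = [], list(range(p))
    X = np.ones((n, 1))
    best = aic(y, X)
    improved = True
    while improved and remaining:
        improved = False
        cand_aic, j = min(
            (aic(y, np.column_stack([X, Z[:, k]])), k) for k in remaining
        )
        if cand_aic < best:
            best = cand_aic
            X = np.column_stack([X, Z[:, j]])
            selected.append(j)
            remaining.remove(j)
            improved = True
    return selected, best

rng = np.random.default_rng(0)
y = rng.standard_normal(200)          # outcome: pure noise
Z = rng.standard_normal((200, 20))    # 20 candidate covariates, all noise
picked, final_aic = forward_stepwise_aic(y, Z)
print(f"noise covariates retained by stepwise AIC: {len(picked)}")
```

Because every retained covariate here is noise by construction, any nonempty selection reflects overfitting rather than signal; pre-specified, theory-driven covariate sets avoid this search over the data.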
