Improving Replicability in Neuroimaging using DevOps Frameworks
Abstract
The growing scale of neuroimaging datasets in contemporary neuroscience has intensified challenges in data management, processing standardisation, and reproducible workflows. DevOps practices from software engineering offer promising solutions to enhance reproducibility as neuroimaging research scales. This narrative synthesis examined DevOps integration in structural neuroimaging workflows through systematic searches of IEEE, Scopus, PubMed, and ArXiv (2016-2025), following PRISMA guidelines with AI-assisted data extraction. Analysis of 38 studies identified four key themes: automated pipelines and containerisation; quality control frameworks; deep learning integration with scalability challenges; and multi-site harmonisation. Studies demonstrated technical advances in reproducible, version-controlled workflows, yet highlighted significant adoption barriers, including requirements for computational expertise, institutional infrastructure limitations, and cultural reluctance towards transparent automated practices. Based on this evidence, we propose a four-domain implementation framework: (1) Technical Readiness (containerisation, version control), (2) Methodological Rigour (automated quality control and validation), (3) Open Code Transparency (reproducible deep-learning pipelines, community-driven maintenance), and (4) Cultural and Institutional Support (training, collaborative practices, sustained funding). Successful DevOps adoption requires continuous improvement across all domains, with commitment and support from researchers, institutions, and funding bodies. This synthesis provides evidence-based guidance for modernising neuroimaging workflows while strengthening scientific rigour.
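The automated quality-control theme identified above can be illustrated with a minimal, hedged sketch: an outlier check that flags subjects whose summary QC metric deviates strongly from the cohort, using a robust (median/MAD-based) z-score. All subject IDs, metric values, and thresholds below are hypothetical examples, not drawn from the reviewed studies.

```python
from statistics import median

def flag_outliers(metrics, z_thresh=3.5):
    """Flag subject IDs whose QC metric is an outlier by robust z-score.

    metrics: dict mapping subject ID -> scalar QC summary metric
             (e.g. a motion estimate); values here are hypothetical.
    Uses the median and median absolute deviation (MAD), which are less
    easily masked by extreme values than mean/SD in small samples.
    """
    values = list(metrics.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # no spread: nothing can be flagged meaningfully
        return []
    return sorted(
        sid for sid, v in metrics.items()
        if abs(0.6745 * (v - med) / mad) > z_thresh
    )

# Hypothetical cohort: sub-04 has an extreme motion estimate.
qc = {"sub-01": 0.12, "sub-02": 0.10, "sub-03": 0.14,
      "sub-04": 0.95, "sub-05": 0.11}
print(flag_outliers(qc))  # -> ['sub-04']
```

In a DevOps-style pipeline, a check like this would run automatically after each processing stage (e.g. in a continuous-integration job inside a versioned container), so that QC failures surface before downstream analysis rather than at publication time.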