“Does it feel like a scientific paper?”: A qualitative analysis of preprint servers’ moderation and quality assurance processes

Abstract

In recent years, preprints—i.e., scholarly manuscripts that have not been peer reviewed or published in a journal—have emerged as a major source of research communication and a critical component of open science. However, concerns have been raised about preprints’ potential to facilitate the spread of flawed or misleading research due to the lack of quality control performed by preprint servers. Yet, there is limited knowledge of how servers currently vet incoming content and how this impacts the openness and diversity of scholarly content. In this paper, we examine preprint servers’ moderation processes, the intentions underpinning them, and their potential effects through a qualitative analysis of in-depth interviews with 14 key preprint server personnel. We find a wide range of moderation processes, which vary depending on specific server contexts and needs and are motivated by a desire to prevent the spread of misinformation and protect trust in preprints and servers. Participants repeatedly emphasized the difference between their moderation processes and peer review, but in practice often applied similar criteria for delineating scientific from unscientific content. Moreover, moderation processes often relied on trust cues, such as article formats or author affiliations, as proxies for research quality, potentially introducing similar biases as have been found in traditional journal peer review. We discuss implications for the diversity of preprint content and authors, as well as the future of preprint servers within an evolving scholarly communication ecosystem.

Article activity feed

  1. This Zenodo record is a permanently preserved version of a PREreview. You can view the complete PREreview at https://prereview.org/reviews/14226929.

    Summary

    This study investigates the moderation processes of 13 preprint servers based on semi-structured interviews with 14 diverse representatives, uncovering a spectrum of practices shaped by local, disciplinary, and organizational contexts. It highlights key tensions in preprint moderation, such as balancing openness with quality assurance, and examines how these processes compare to peer review.

    Positive feedback

    Overall

    "This is a very interesting and important study at an apt time when preprint servers are popular and funders are adopting preprint policies."

    Methodology

    "The study includes good qualitative methods and in-depth interviews with key personnel across diverse preprint servers."

    Discussion

    "I appreciate the balanced and thoughtful discussion and conclusion."

    Major issues

    Data Presentation

    Comment: "It is so hard to work out which servers are in which category referred to in the text, and some key information is buried in the supplemental file."

    Suggestion: Move key information, such as server classifications and moderation workflows, from supplemental materials into the main text. Summarize these in a table or visual format.

    Clarification of Peer Review and Moderation Boundaries

    Comment: "Some more discussion on where the boundary between [moderation and peer review] might lie would be useful."

Suggestion: Discuss how moderation criteria (e.g., scholarly content, novelty) align with or diverge from peer review criteria.

    Minor issues

    Terminology Precision

    Comment: "The authors use 'scholarly content' and 'scientific research' interchangeably."

Suggestion: Define key terms explicitly and apply them consistently.

    Visualization of Findings

    Comment: "The paper would benefit from a clear visualization of the moderation workflows across different servers."

    Suggestion: Add a flowchart or infographic to illustrate the moderation steps and criteria employed by various servers.

    Scope Column for Tables

    Comment: "Consider adding a scope column to Table 1 to clarify the disciplinary and/or geographical requirements of the preprint servers."

    Suggestion: Add a "scope" column to highlight these aspects in tables summarizing server characteristics.

    Acronym

    Comment: "What does STS mean?"

    Suggestion: Spell out STS.

    References

    Comment: "Not in reference list."

Suggestion: Add the works cited in these parenthetical citations to the References list:

(Ball 2021; Gibbons 1999); (Pereira and de Oliveira 2024; Swire-Thompson and Lazer 2020).

    Numbers don't add up

    Text: "Ten participants chose option 1, three chose option 2, and  zero chose option 3."

    Comment: "Did one participant decline to choose between the levels?"

    Suggestion": Double-check the dataset for accuracy/report the missing participant's choice.

    Table 1 Clarity

    Column 3: Country

    Comment: "Is this the country where the preprint service is held or the owning organization?"

    Suggestion: Clarify the meaning of "Country" in Table 1 caption.

    Table from OSF Materials

Text: 5.3.1, "Moderation Criteria", with OSF link https://osf.io/drtj6/

    Comment: "The Moderation Workflows table is really helpful in understanding the discovery of the interviews. I would recommend embedding it into the article or highlighting the link more prominently."

Suggestion: Make this edit, as four other reviewers agreed with it. As the synthesizer, I recommend using numbered lists and bullet points for columns 4-5 of that table to ease skimming.

    Table Citations

Text: 5.3.2, "Tables 3 and 4 (Appendix)"

Comment: "Should this be 'Tables 2 and 3'? Is Table 4 (Appendix) referring to the Moderation Workflows supplement?"

    Suggestion: Check this table citation and correct if needed.

    Fairness

Text: 5.3.3, "It is therefore neither scalable…"

Comment: "Some consideration should be given here to the other modes of author verification not being applicable in this context. It's a unique solution for a reason and works well in a context where the other approaches listed may not. I think this sentence comes across as too dismissive of the approach of RINarxiv."

Suggestion: Discuss this point among the co-authors and temper the sentence if possible.

    Align citation

    Text: Appendix → "Table 2"

    Comments: "No call out in main text"

    Suggestions: add the Table 2 citation in the main body

    Future research

    Effectiveness of Moderation Practices

    "A deeper study evaluating what moderation practices are the most effective…would be recommended to preprint servers."

    Author and Submission Outcomes

    "I would love to know where the rejected preprints go (i.e., do they end up being published as-is somewhere?)."

    Comparative Analysis with Peer Review

    "A comparison between preprint server screening and initial journal submission screening would be really valuable."

    Inclusivity of Perspectives

    "The study could have benefited from including perspectives of authors and users of preprint servers."

    Competing interests

Ashley Farley is a Gates Foundation employee who leads its preprint policy and serves as the main manager of VeriXiv, in partnership with F1000. Theodora Bloom is Executive Editor at BMJ Publishing Group and co-founder of the medRxiv preprint server. Martyn Rittman is the former Director of preprints.org, a preprint server. Jay Patel has completed a graduate-level course taught by the second author, is collaborating with the second author on Ph.D. research, and is co-mentored by the second author as a Ph.D. student.