Timeline from receipt to online publication of COVID-19 original research articles


Abstract

Objective

To compare the timeline from submission to online publication of Coronavirus Disease 2019 (COVID)-related original articles with that of non-COVID-related original articles.

Background

There have been growing concerns about the speed, and consequently the rigor, of journals' review processes for COVID-related articles.

Methods

Using Dimensions, an online searchable platform, we identified PubMed-indexed journals that published >50 COVID-related articles (regardless of article type) between 1/1/2020 and 5/16/2020 and reported the date of article receipt. For the control group, we included consecutive full-length original investigations with available receipt dates (regardless of topic) published in these journals, starting from 3/1/2019, until a 1:2 ratio of COVID-related to control articles per journal was reached.

Results

The final sample comprised 294 COVID-related full-length original investigations with available receipt dates, published in 16 journals, together with 588 control articles from the same journals. The median time from article receipt to online publication was 20 days (IQR, 11-32) for COVID-related articles vs. 119 days (IQR, 62-182) for controls (P<0.001). The median time to final acceptance (available for 97% of articles) was 13 days (IQR, 5-23) for COVID-related articles vs. 102 days (IQR, 55-161) for controls (P<0.001). These differences were consistent across all 16 included journals.

Conclusions

In this analysis of full-length original investigations published in 16 medical journals, the median time from receipt to final acceptance was roughly 8-fold shorter for COVID-related articles than for non-COVID-related articles published in a similar time frame the previous year. The median time to online publication was roughly 6-fold shorter for COVID-related articles than for controls.
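The fold differences stated above follow directly from the reported medians; a minimal arithmetic check using only the figures from the abstract:

```python
# Median times (days) as reported in the abstract
covid_accept, control_accept = 13, 102    # receipt -> final acceptance
covid_publish, control_publish = 20, 119  # receipt -> online publication

# Ratio of control median to COVID median
accept_fold = control_accept / covid_accept      # ~7.8, reported as ~8-fold
publish_fold = control_publish / covid_publish   # ~5.95, reported as ~6-fold

print(f"Acceptance: {accept_fold:.1f}x shorter for COVID-related articles")
print(f"Publication: {publish_fold:.1f}x shorter for COVID-related articles")
```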

Article activity feed

  1. SciScore for 10.1101/2020.06.22.20137653: (What is this?)

    Please note, not all rigor criteria are appropriate for all manuscripts.

    Table 1: Rigor

    Institutional Review Board Statement: not detected.
    Randomization: not detected.
    Blinding: not detected.
    Power Analysis: not detected.
    Sex as a biological variable: not detected.

    Table 2: Resources

    Software and Algorithms
    Sentence: The statistical analyses were performed using SPSS statistical package (SPSS version 25.0, IBM Inc., Armonk, NY).
    Resource: SPSS, suggested: (SPSS, RRID:SCR_002865)

    Results from OddPub: We did not detect open data. We also did not detect open code. Researchers are encouraged to share open data when possible (see Nature blog).


    Results from LimitationRecognizer: An explicit section about the limitations of the techniques employed in this study was not found. We encourage authors to address study limitations.

    Results from TrialIdentifier: No clinical trial numbers were referenced.


    Results from Barzooka: We did not find any issues relating to the usage of bar graphs.


    Results from JetFighter: We did not find any issues relating to colormaps.


    Results from rtransparent:
    • Thank you for including a conflict of interest statement. Authors are encouraged to include this statement when submitting to a journal.
    • Thank you for including a funding statement. Authors are encouraged to include this statement when submitting to a journal.
    • No protocol registration statement was detected.

    About SciScore

    SciScore is an automated tool that is designed to assist expert reviewers by finding and presenting formulaic information scattered throughout a paper in a standard, easy to digest format. SciScore checks for the presence and correctness of RRIDs (research resource identifiers), and for rigor criteria such as sex and investigator blinding. For details on the theoretical underpinning of rigor criteria and the tools shown here, including references cited, please follow this link.