Dynamics of the COVID-19 Related Publications


Abstract

Background

This study aims to analyze the dynamics of published articles and preprints in the COVID-19 related literature across different scientific databases and sharing platforms.

Methods

The PubMed, Elsevier, and ResearchGate (RG) databases were examined over specific time windows. Analyses were carried out on the number of publications as a function of (a) time (day), (b) journal, and (c) author. The doubling time of the number of publications was analyzed for PubMed “all articles” and Elsevier published articles. The analyzed datasets were: (1A) PubMed “all articles” (01/12/2019–12/06/2020), (1B) PubMed review articles (01/12/2019–02/05/2020), (1C) PubMed clinical trials (01/01/2020–30/06/2020), (2) Elsevier all publications (01/12/2019–25/05/2020), and (3) RG articles, preprints, and technical reports (15/04/2020–30/04/2020).
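
As a rough illustration of these counting and doubling-time analyses (a minimal sketch, not the authors' actual pipeline; the input file and the "date", "journal", and "author" column names are assumptions), the per-day counts and a doubling-time estimate could be computed as follows:

```python
# Minimal sketch of the per-day counting and doubling-time estimate described above.
# The input file and column names ("date", "journal", "author") are illustrative
# assumptions, not the authors' actual data layout.
import numpy as np
import pandas as pd

records = pd.read_csv("covid19_publication_records.csv", parse_dates=["date"])

# (a) Number of publications as a function of time (per day).
per_day = records.groupby(records["date"].dt.date).size()
print(f"publications/day: {per_day.mean():.1f} +/- {per_day.std():.1f}")

# (b) Publications per journal and (c) per author.
per_journal = records.groupby("journal").size()
per_author = records.groupby("author").size()
print(f"articles/journal: {per_journal.mean():.1f} +/- {per_journal.std():.1f}")
print(f"articles/author:  {per_author.mean():.1f} +/- {per_author.std():.1f}")

# Doubling time: fit log2 of the cumulative count against time (in days)
# and take the reciprocal of the slope (days per doubling).
cumulative = per_day.cumsum()
days = np.arange(len(cumulative))
slope, _ = np.polyfit(days, np.log2(cumulative.to_numpy(dtype=float)), 1)
print(f"estimated doubling time: {1.0 / slope:.1f} days")
```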

Findings

Total publications in the observation period were 23,000 for PubMed, 5,898 for Elsevier, and 5,393 for RG. The average number of publications per day was 70.0±128.6 for PubMed, 77.6±125.3 for Elsevier, and 255.6±205.8 for RG. PubMed showed an avalanche in the number of publications around May 10, when the publication rate jumped from 6.0±8.4/day to 282.5±110.3/day. The average doubling time was 10.3±4 days for PubMed, 20.6 days for Elsevier, and 2.3±2.0 days for RG. In PubMed, the average number of articles per journal was 5.2±10.3, and the top 20 authors, representing 935 articles, are of Chinese descent. The average number of publications per author was 1.2±1.4 for PubMed, 1.3±0.9 for Elsevier, and 1.1±0.4 for RG. In the subgroup analysis of PubMed review articles, the mean and median review times per article were 20.17±17.77 and 13.9 days respectively, and the review time was decreasing at a rate of 0.21 days per day.
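
To make the reported doubling times concrete (a back-of-the-envelope illustration, not part of the study; the starting count N0 is arbitrary), a fixed doubling time T_d implies growth of the form N(t) = N0 * 2^(t / T_d):

```python
# Growth implied by a fixed doubling time: N(t) = N0 * 2 ** (t / T_d).
# T_d uses the reported PubMed average; N0 = 1000 is an arbitrary illustration.
T_d = 10.3   # reported average doubling time for PubMed, in days
N0 = 1000    # hypothetical starting number of publications
for t in (0, 10, 20, 30):
    print(f"day {t:2d}: ~{N0 * 2 ** (t / T_d):,.0f} publications")
```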

Interpretation

Although the disease has been known for only about six months, the number of COVID-19 related publications is already huge and growing very fast over time. It is essential that researchers, authors, reviewers, and publishing houses rationalize these publications scientifically.

Funding

None

Article activity feed

  1. SciScore for 10.1101/2020.08.05.237313:

    Please note, not all rigor criteria are appropriate for all manuscripts.

    Table 1: Rigor

    Institutional Review Board Statement: not detected.
    Randomization: not detected.
    Blinding: not detected.
    Power Analysis: not detected.
    Sex as a biological variable: not detected.

    Table 2: Resources

    Software and Algorithms
    Sentence: Number of publications as a function of doubling time was analyzed for PubMed “all articles” and Elsevier published articles.
    Resource: PubMed, suggested: (PubMed, RRID:SCR_004846)

    Results from OddPub: We did not detect open data. We also did not detect open code. Researchers are encouraged to share open data when possible (see Nature blog).


    Results from LimitationRecognizer: An explicit section about the limitations of the techniques employed in this study was not found. We encourage authors to address study limitations.

    Results from TrialIdentifier: No clinical trial numbers were referenced.


    Results from Barzooka: We did not find any issues relating to the usage of bar graphs.


    Results from JetFighter: We did not find any issues relating to colormaps.


    Results from rtransparent:
    • Thank you for including a conflict of interest statement. Authors are encouraged to include this statement when submitting to a journal.
    • Thank you for including a funding statement. Authors are encouraged to include this statement when submitting to a journal.
    • No protocol registration statement was detected.

    About SciScore

    SciScore is an automated tool that is designed to assist expert reviewers by finding and presenting formulaic information scattered throughout a paper in a standard, easy to digest format. SciScore checks for the presence and correctness of RRIDs (research resource identifiers), and for rigor criteria such as sex and investigator blinding. For details on the theoretical underpinning of rigor criteria and the tools shown here, including references cited, please follow this link.