EUAdb: A resource for COVID-19 test development and comparison

This article has been reviewed by the following groups


Abstract

Due to the sheer number of COVID-19 (coronavirus disease 2019) cases, there is a need for increased worldwide SARS-CoV-2 testing capability that is both efficient and effective. Having open and easy access to detailed information about these tests, their sensitivity, the types of samples they use, etc., would be highly useful to ensure their reproducibility, to help clients compare and decide which tests would be best suited for their applications, and to avoid the costs of reinventing similar or identical tests. Additionally, such a resource would provide a means of comparing the many innovative diagnostic tools currently in development, establishing a foundation of technologies and methods for the rapid development and deployment of tests for future emerging diseases. Such a resource might thus help to avert the delays in testing and screening that were observed in the early stages of the pandemic and plausibly led to more COVID-19-related deaths than necessary. We aim to address these needs via a relational database containing a standardized ontology and curated data about COVID-19 diagnostic tests that have been granted Emergency Use Authorizations (EUAs) by the FDA (US Food and Drug Administration). Simple queries of this actively growing database demonstrate considerable variation among these tests with respect to sensitivity (limits of detection, LoD), controls and targets used, criteria used for calling results, sample types, reagents and instruments, and quality and amount of information provided.
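As a minimal sketch of the kind of simple query the abstract describes, the example below pulls limits of detection grouped by sample type from a hypothetical relational export of the database. The file name, table names, and column names are assumptions made for illustration only and are not the actual EUAdb schema.

    # Minimal sketch (not the authors' code): comparing limits of detection (LoD)
    # across sample types from a hypothetical SQLite export of EUAdb.
    # File, table, and column names below are assumptions for illustration.
    import sqlite3

    conn = sqlite3.connect("euadb_export.sqlite")  # hypothetical local export
    query = """
        SELECT test_name, sample_type, lod_value, lod_units
        FROM tests
        JOIN lod_measurements USING (test_id)
        ORDER BY sample_type, lod_value;
    """
    # Print one line per test/sample-type pair, ordered by sample type and LoD.
    for test_name, sample_type, lod_value, lod_units in conn.execute(query):
        print(f"{test_name}: {lod_value} {lod_units} ({sample_type})")
    conn.close()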

Article activity feed

  1. SciScore for 10.1101/2020.07.30.228890:

    Please note, not all rigor criteria are appropriate for all manuscripts.

    Table 1: Rigor

    Institutional Review Board Statement: not detected.
    Randomization: not detected.
    Blinding: not detected.
    Power Analysis: not detected.
    Sex as a biological variable: not detected.

    Table 2: Resources

    No key resources detected.


    Results from OddPub: We did not detect open data. We also did not detect open code. Researchers are encouraged to share open data when possible (see Nature blog).


    Results from LimitationRecognizer: An explicit section about the limitations of the techniques employed in this study was not found. We encourage authors to address study limitations.

    Results from TrialIdentifier: No clinical trial numbers were referenced.


    Results from Barzooka: We did not find any issues relating to the usage of bar graphs.


    Results from JetFighter: We did not find any issues relating to colormaps.


    Results from rtransparent:
    • Thank you for including a conflict of interest statement. Authors are encouraged to include this statement when submitting to a journal.
    • Thank you for including a funding statement. Authors are encouraged to include this statement when submitting to a journal.
    • No protocol registration statement was detected.

    About SciScore

    SciScore is an automated tool that is designed to assist expert reviewers by finding and presenting formulaic information scattered throughout a paper in a standard, easy-to-digest format. SciScore checks for the presence and correctness of RRIDs (research resource identifiers), and for rigor criteria such as sex as a biological variable and investigator blinding. For details on the theoretical underpinning of rigor criteria and the tools shown here, including references cited, please follow this link.