Deep convolutional approaches for the analysis of Covid-19 using chest X-Ray images from portable devices

Abstract

The recent human coronavirus disease (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), was declared a global pandemic on 11 March 2020 by the World Health Organization. Given the effects of COVID-19 on pulmonary tissue, chest radiography plays an important role in the screening, early detection and monitoring of suspected individuals. Hence, as the COVID-19 pandemic progresses, there will be a greater reliance on portable equipment for the acquisition of chest X-Ray images due to its accessibility, widespread availability and benefits regarding infection control, minimizing the risk of cross-contamination. This work presents novel fully automatic approaches specifically tailored for the classification of chest X-Ray images acquired with portable equipment into 3 clinical categories: normal, pathological and COVID-19. For this purpose, two complementary deep learning approaches based on a densely connected convolutional network architecture are presented. The joint response of both approaches enhances the differentiation between patients infected with COVID-19, patients with other diseases that manifest characteristics similar to COVID-19, and normal cases. The proposed approaches were validated on a dataset provided by the Radiology Service of the Complexo Hospitalario Universitario A Coruña (CHUAC), retrieved specifically for this research. Despite the poor image quality inherent to portable equipment, the proposed approaches provided satisfactory results, allowing a reliable analysis of portable radiographs to support the clinical decision-making process.
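
The abstract describes a 3-class classifier (normal, pathological, COVID-19) built on a densely connected convolutional network, with two complementary models whose joint response drives the final decision. The sketch below illustrates how such a setup could look; the framework (PyTorch/torchvision), the DenseNet-121 variant, the preprocessing, and the softmax-averaging fusion rule are all assumptions for illustration and are not confirmed by the abstract.

```python
# Minimal sketch (assumptions: PyTorch/torchvision, DenseNet-121 backbone,
# softmax averaging as the fusion of the two complementary approaches).
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 3  # normal, pathological, COVID-19


def build_densenet_classifier(num_classes: int = NUM_CLASSES) -> nn.Module:
    """DenseNet backbone with its classifier head replaced for 3 classes."""
    net = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
    net.classifier = nn.Linear(net.classifier.in_features, num_classes)
    return net


# Typical preprocessing for an ImageNet-pretrained backbone (assumption):
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # X-rays are single channel
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])


@torch.no_grad()
def joint_prediction(model_a: nn.Module, model_b: nn.Module,
                     batch: torch.Tensor) -> torch.Tensor:
    """Combine two complementary classifiers by averaging their softmax
    outputs; the fusion rule actually used in the paper may differ."""
    model_a.eval()
    model_b.eval()
    probs = (torch.softmax(model_a(batch), dim=1) +
             torch.softmax(model_b(batch), dim=1)) / 2
    return probs.argmax(dim=1)  # 0 = normal, 1 = pathological, 2 = COVID-19
```

In practice, each model would be fine-tuned on the portable chest X-Ray dataset before the joint prediction is applied; the training procedure is not detailed in the abstract.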

Article activity feed

  1. SciScore for 10.1101/2020.06.18.20134593:

    Please note, not all rigor criteria are appropriate for all manuscripts.

    Table 1: Rigor

    NIH rigor criteria are not applicable to paper type.

    Table 2: Resources

    No key resources detected.


    Results from OddPub: We did not detect open data. We also did not detect open code. Researchers are encouraged to share open data when possible (see Nature blog).


    Results from LimitationRecognizer: An explicit section about the limitations of the techniques employed in this study was not found. We encourage authors to address study limitations.

    Results from TrialIdentifier: No clinical trial numbers were referenced.


    Results from Barzooka: We did not find any issues relating to the usage of bar graphs.


    Results from JetFighter: We did not find any issues relating to colormaps.


    Results from rtransparent:
    • Thank you for including a conflict of interest statement. Authors are encouraged to include this statement when submitting to a journal.
    • Thank you for including a funding statement. Authors are encouraged to include this statement when submitting to a journal.
    • No protocol registration statement was detected.

    About SciScore

    SciScore is an automated tool designed to assist expert reviewers by finding and presenting formulaic information scattered throughout a paper in a standard, easy-to-digest format. SciScore checks for the presence and correctness of RRIDs (research resource identifiers), and for rigor criteria such as sex and investigator blinding. For details on the theoretical underpinning of the rigor criteria and the tools shown here, including references cited, please follow this link.