A Web-Based, Mobile-Responsive Application to Screen Health Care Workers for COVID-19 Symptoms: Rapid Design, Deployment, and Usage

Abstract

As of July 17, 2020, the COVID-19 pandemic has affected over 14 million people worldwide, with over 3.68 million cases in the United States. As the number of COVID-19 cases increased in Massachusetts, the Massachusetts Department of Public Health mandated that all health care workers be screened for symptoms daily prior to entering any hospital or health care facility. We rapidly created a digital COVID-19 symptom screening tool to enable this screening for a large, academic, integrated health care delivery system, Partners HealthCare, in Boston, Massachusetts.

Objective

The aim of this study is to describe the design and development of the COVID Pass COVID-19 symptom screening application and report aggregate usage data from the first three months of its use across the organization.

Methods

Using agile principles, we designed, tested, and implemented a solution over the span of one week, using progressively customized development approaches as the requirements and use case became more solidified. We developed the minimum viable product (MVP) of a mobile-responsive, web-based, self-service application using Research Electronic Data Capture (REDCap). For employees without access to a computer or mobile device to use the self-service application, we established a manual process in which in-person, socially distanced screeners asked employees entering the site whether they had symptoms and then manually recorded the responses in a Microsoft Office 365 Form. As COVID Pass was scaled, a custom .NET Framework application was developed. We collected log data from the .NET application, REDCap, and Microsoft Office 365 for the first three months of enterprise deployment (March 30 to June 30, 2020). Aggregate descriptive statistics were obtained, including overall employee attestations by day and site, attestations by application method (COVID Pass automatic screening vs manual screening), attestations by time of day, and the percentage of employees reporting COVID-19 symptoms.
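The kind of log aggregation described above can be sketched in a few lines. The record layout below (fields `timestamp`, `site`, `method`, `symptomatic`) is a hypothetical illustration, not the actual COVID Pass data model, and the study itself used SAS and Excel rather than Python:

```python
from collections import Counter
from datetime import datetime

# Hypothetical attestation log records; field names are illustrative only,
# not the actual COVID Pass schema.
attestations = [
    {"timestamp": "2020-03-30T07:12:00", "site": "Site A", "method": "self-service", "symptomatic": False},
    {"timestamp": "2020-03-30T07:45:00", "site": "Site B", "method": "manual", "symptomatic": True},
    {"timestamp": "2020-03-31T14:05:00", "site": "Site A", "method": "self-service", "symptomatic": False},
]

# Attestations per day, per screening method, and per hour of day.
by_day = Counter(a["timestamp"][:10] for a in attestations)
by_method = Counter(a["method"] for a in attestations)
by_hour = Counter(datetime.fromisoformat(a["timestamp"]).hour for a in attestations)

# Percentage of attestations reporting symptoms.
pct_symptomatic = 100 * sum(a["symptomatic"] for a in attestations) / len(attestations)

print(by_day["2020-03-30"])       # 2
print(by_method["manual"])        # 1
print(round(pct_symptomatic, 1))  # 33.3
```

With real log exports, the same counts would feed the by-day, by-site, by-method, and by-hour summaries reported in the Results.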

Results

We rapidly created the MVP and gradually deployed it across the hospitals in our organization. By the end of the first week, the screening application was being used by over 25,000 employees each weekday. After three months, 2,169,406 attestations were recorded with COVID Pass. Over this period, 1865/160,159 employees (1.2%) reported positive symptoms. 1,976,379 of the 2,169,406 attestations (91.1%) were generated from the self-service screening application. The remainder were generated either from manual attestation processes (174,865/2,169,406, 8.1%) or COVID Pass kiosks (25,133/2,169,406, 1.2%). Hospital staff continued to work 24 hours per day, with staff attestations peaking around shift changes between 7 and 8 AM, 2 and 3 PM, 4 and 6 PM, and 11 PM and midnight.
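As a quick arithmetic check (not part of the study's SAS analysis), the reported percentages follow directly from the raw counts:

```python
# Raw counts reported in the Results section.
total_attestations = 2_169_406
self_service = 1_976_379
manual = 174_865
kiosk = 25_133
symptomatic_employees = 1_865
total_employees = 160_159

print(round(100 * self_service / total_attestations, 1))           # 91.1
print(round(100 * manual / total_attestations, 1))                 # 8.1
print(round(100 * kiosk / total_attestations, 1))                  # 1.2
print(round(100 * symptomatic_employees / total_employees, 1))     # 1.2
```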

Conclusions

Using rapid, agile development, we quickly created and deployed a dedicated employee attestation application that gained widespread adoption and use within our health system. Further, we identified 1865 symptomatic employees who otherwise may have come to work, potentially putting others at risk. We share the story of our implementation, lessons learned, and source code (via GitHub) for other institutions who may want to implement similar solutions.

Article activity feed

  1. SciScore for 10.1101/2020.04.17.20069211:

    Please note, not all rigor criteria are appropriate for all manuscripts.

    Table 1: Rigor

    Institutional Review Board Statement: IRB: This study was approved by the Partners HealthCare Institutional Review Board.
    Randomization: not detected.
    Blinding: not detected.
    Power Analysis: not detected.
    Sex as a biological variable: not detected.

    Table 2: Resources

    Software and Algorithms
    Sentence: "The self-service application, built in REDCap, was called COVID Pass."
    Resource: REDCap; suggested: (REDCap, RRID:SCR_003445)

    Sentence: "Aggregate descriptive statistics including overall employee attestations by day and site, employee attestations by application method (COVID Pass automatic screening vs. manual screening), employee attestations by time of day, and percentage of employees reporting COVID-19 symptoms were compiled using SAS (Enterprise Guide 7.1, SAS Institute Inc, Cary, NC) and Microsoft Excel (Microsoft Corporation, Redmond, WA)."
    Resource: SAS; suggested: (Statistical Analysis System, RRID:SCR_008567)
    Resource: Microsoft Excel; suggested: (Microsoft Excel, RRID:SCR_016137)

    Results from OddPub: Thank you for sharing your code.


    Results from LimitationRecognizer: We detected the following sentences addressing limitations in the study:
    As part of this work, we also knew that we would need to offer an accessible pathway for users who may not be able to complete the self-service COVID Pass due to limited proficiency with a smartphone or computer, language limitations, or other reasons. By incorporating this accessibility requirement early on in our development and creating a manual pathway, COVID Pass could then be a comprehensive solution for all employees. Adopting an agile development process further minimized friction for the end user through early on-site testing and rapid iteration cycles. An agile development approach emphasizes "early and continuous delivery of valuable software" while accommodating changing requirements as software is utilized in the real world and assumptions are validated or invalidated. 9 Our team created the MVP for COVID Pass within 48 hours and began testing the solution the very next day. We refined the requirements for COVID Pass and made updates to the MVP multiple times throughout the day at the beginning of this process. Much of this early ability to create such rapid changes and adjustments to COVID Pass came from the initial platform decision to utilize REDCap for the MVP. As the core functionality of REDCap was able to accommodate most of the initial requirements of COVID Pass, adjusting the symptom survey fields and conditional text could be done almost instantaneously. This enabled our team to sustain a rapid iteration cycle during the first week of deployment whi...

    Results from TrialIdentifier: No clinical trial numbers were referenced.


    Results from Barzooka: We did not find any issues relating to the usage of bar graphs.


    Results from JetFighter: We did not find any issues relating to colormaps.


    Results from rtransparent:
    • Thank you for including a conflict of interest statement. Authors are encouraged to include this statement when submitting to a journal.
    • Thank you for including a funding statement. Authors are encouraged to include this statement when submitting to a journal.
    • No protocol registration statement was detected.

    About SciScore

SciScore is an automated tool that is designed to assist expert reviewers by finding and presenting formulaic information scattered throughout a paper in a standard, easy to digest format. SciScore checks for the presence and correctness of RRIDs (research resource identifiers), and for rigor criteria such as sex and investigator blinding.