A Proposed Method for Automating Website Usability Assessments in Healthcare with a Large Language Model: Observational Study of Global Emergency Medicine Fellowship Programs

Abstract

Introduction: The Internet has become a key resource for individuals managing their healthcare needs, with hospital websites serving as critical access points for health information. Usability factors such as readability, accessibility, and content quality significantly impact user experience and patient decision-making. While previous studies have assessed website usability using manual methodologies, these processes are time-consuming and inefficient. This study aims to automate an established usability scoring methodology for healthcare websites, focusing on 30 Global Emergency Medicine Fellowship websites.

Methods: We manually compiled a dataset of URLs from institutions offering Global Emergency Medicine Fellowship programs, sourced from the SAEM website. An automated process assessed website usability, focusing on accessibility, marketing (SEO), content quality, and technology. The Python libraries Requests, BeautifulSoup, and Textstat, together with OpenAI's GPT-3.5-turbo, were used for data extraction, grammar checks, and readability analysis. Data analysis covered SEO factors, multimedia content, website loading times, and broken links.

Results: The mean usability score was 72.4 ± 8.3 (range: 58–89), with a 95% confidence interval of 67.3–73.7. Accessibility scores varied greatly (0–206.8), showing inconsistent support for assistive needs. The average SEO score was low at 13.03 ± 8.77; only 40% of sites used proper meta descriptions and alt-texts. Multimedia use was limited, with an average score of 7.89 (range: 0–26), and only 35% of sites had updated fellowship details. Websites using AI features such as chatbots showed a 15% drop in bounce rates and a 20% rise in time on site. Automation reduced analysis time by 45% (p < 0.05) versus manual review.

Conclusion: This study highlights the critical role of usability, accessibility, and content quality in engaging prospective applicants. Variability in design, readability, and SEO underscores the need for standardized, user-centered development. Future research should explore further optimization of these tools and their potential application across other medical specialties.
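The automated checks described in the Methods (meta descriptions, image alt-texts, and link inventory for broken-link testing) can be sketched as follows. This is a minimal stdlib-only illustration, not the authors' pipeline: the study used Requests, BeautifulSoup, and Textstat, whereas the `UsabilityAuditor` class and `audit` function below are hypothetical names introduced here for demonstration.

```python
from html.parser import HTMLParser


class UsabilityAuditor(HTMLParser):
    """Collects simple usability signals from one HTML page:
    the meta description, image alt-text coverage, and outbound links
    (the links would later be fetched to detect broken ones)."""

    def __init__(self):
        super().__init__()
        self.meta_description = None
        self.images_total = 0
        self.images_with_alt = 0
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag == "img":
            self.images_total += 1
            # An empty alt attribute is treated as missing for this sketch.
            if attrs.get("alt"):
                self.images_with_alt += 1
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])


def audit(html: str) -> dict:
    """Parse a page and return the usability signals as a flat report."""
    parser = UsabilityAuditor()
    parser.feed(html)
    return {
        "has_meta_description": parser.meta_description is not None,
        "alt_text_coverage": (
            parser.images_with_alt / parser.images_total
            if parser.images_total else 1.0
        ),
        "link_count": len(parser.links),
    }


# Illustrative input: one image has alt text, one does not.
sample = """
<html><head><meta name="description" content="Fellowship program"></head>
<body><img src="a.png" alt="campus"><img src="b.png">
<a href="https://example.org/apply">Apply</a></body></html>
"""
report = audit(sample)
```

In a full pipeline, each URL in the dataset would first be fetched (e.g. with Requests), the returned HTML passed to `audit`, and each collected link re-fetched to flag broken ones; readability would be scored separately on the extracted text.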
