Do physician associates and medical graduates have comparable knowledge? A re-analysis of progress test data


Abstract

Objective To re-evaluate the claim that physician associate (PA) and medical graduates have "comparable knowledge" by analysing the correlation patterns of their performance on a standardised multiple-choice question test, rather than relying solely on mean scores.

Design Secondary cross-sectional analysis of publicly available data from a single UK institution.

Participants Progress test results from 96 PA students (across two stages), 1,195 medical students (stages 1–5), and 65 Foundation Year 1 (FY1) doctors.

Main Outcome Measures Pearson correlation coefficients of question-by-question performance between groups, supplemented with mean scores on an 88-item multiple-choice test.

Results Despite similar mean scores between second-year PAs and fourth-year medical students, their question-by-question performance showed no correlation. PA performance was also uncorrelated with that of final-year medical students (r = 0.045) and FY1 doctors (r = 0.008), in stark contrast to the strong correlation observed between medical students and FY1 doctors (r = 0.927). These divergent patterns led to extreme performance differences on specific questions. For example, on one question (M3433) PAs scored 89% while final-year medical students scored 5%. Conversely, on another (M3497), PAs scored 2% while medical students scored 95%.

Conclusion Relying on mean scores alone to assess knowledge equivalence is misleading. Correlation analysis reveals that the PAs and medical graduates in this cohort possess fundamentally different knowledge structures, demonstrating distinct patterns of strengths and weaknesses. These data do not support the claim that the two professions have comparable knowledge bases.
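The style of analysis described above can be illustrated with a minimal sketch. This is not the authors' code: it assumes each group's results have already been reduced to a vector of per-question proportions correct on the same 88-item test, and it uses illustrative random data in place of the study's actual scores.

```python
# Minimal sketch of a question-by-question correlation analysis.
# Hypothetical data: per-question proportion correct for each group
# on the same 88-item test (values are illustrative, not the study's).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_questions = 88

# Illustrative per-question proportions correct for each group.
pa_scores = rng.uniform(0.0, 1.0, n_questions)    # PA students
med_scores = rng.uniform(0.0, 1.0, n_questions)   # final-year medical students
# FY1 scores built to track the medical-student pattern closely.
fy1_scores = np.clip(med_scores + rng.normal(0.0, 0.05, n_questions), 0.0, 1.0)

# Pearson r between groups captures whether the questions one group
# finds easy or hard are the same ones another group finds easy or
# hard, independently of how the mean scores compare.
r_pa_med, _ = pearsonr(pa_scores, med_scores)
r_pa_fy1, _ = pearsonr(pa_scores, fy1_scores)
r_med_fy1, _ = pearsonr(med_scores, fy1_scores)

print(f"PA vs medical students: r = {r_pa_med:.3f}")
print(f"PA vs FY1 doctors:      r = {r_pa_fy1:.3f}")
print(f"Medical vs FY1 doctors: r = {r_med_fy1:.3f}")
```

With real data, the three score vectors would hold the observed proportion of each group answering each question correctly. A near-zero r then indicates that the questions one group finds easy are unrelated to those another group finds easy, even when the groups' mean scores are similar.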
