Are Medical School Preclinical Tests Biased for Sex and Race? A Differential Item Functioning Analysis

Abstract

Background: It has become common practice in assessment development to verify that test items function equivalently across subgroups of test-takers, a requirement fundamental to fairness and, consequently, to the validity of test score interpretations and uses. Accordingly, we conducted differential item functioning (DIF) analyses by sex and race for three preclinical medical school foundational courses.

Methods: The sample comprised 520, 519, and 344 medical students for anatomy, histology, and physiology, respectively, with data collected from 2018 to 2020. DIF analyses were conducted in the IRTPRO software using the item response theory two-parameter logistic (2PL) model.

Results: Up to one-fifth of the items on the three assessments functioned differentially by sex and/or race: 10 of 49 items (20%) in Anatomy, 6 of 40 items (15%) in Histology, and 5 of 45 items (11%) in Physiology showed statistically significant DIF. Measurement specialists and subject matter experts independently reviewed the flagged items to identify construct-irrelevant factors as potential sources of DIF; most flagged items were poorly written or contained unclear images.

Conclusions: The validity of score-based inferences, particularly for subgroup comparisons, requires that test items function equivalently across examinee subgroups. In the present study, we found DIF by sex and race for some items in all three content areas. This approach should be explored in other medical schools to assess the generalizability of the present findings, and item-level DIF analysis should be routinely conducted as part of the psychometric evaluation of basic science course assessments and other examinations.
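For readers unfamiliar with DIF screening: an item shows uniform DIF when examinees of equal ability but different group membership have different probabilities of answering it correctly. The sketch below is not the authors' IRTPRO 2PL procedure; it illustrates the simpler logistic-regression DIF method (Swaminathan and Rogers), fit here with a plain gradient-descent routine on simulated data. All function names, parameter values, and the simulated responses are illustrative assumptions.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, iters=3000):
    """Fit a logistic regression by batch gradient descent.
    Returns the fitted weights and the maximized log-likelihood."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            for j in range(d):
                grad[j] += (yi - p) * xi[j]
        w = [wj + lr * gj / n for wj, gj in zip(w, grad)]
    ll = 0.0
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard log(0)
        ll += yi * math.log(p) + (1 - yi) * math.log(1.0 - p)
    return w, ll

def uniform_dif_stat(matching_scores, groups, item_responses):
    """Likelihood-ratio statistic for uniform DIF on one item:
    compares  logit P(correct) ~ matching score
    against   logit P(correct) ~ matching score + group.
    Values above 3.84 (chi-square, df = 1, alpha = .05) flag the item."""
    base = [[1.0, s] for s in matching_scores]
    full = [[1.0, s, g] for s, g in zip(matching_scores, groups)]
    _, ll_base = fit_logistic(base, item_responses)
    _, ll_full = fit_logistic(full, item_responses)
    return 2.0 * (ll_full - ll_base)

# Simulated item with built-in uniform DIF: group 1 finds it one logit
# easier at every ability level (all numbers here are made up).
random.seed(0)
n = 400
groups = [i % 2 for i in range(n)]
ability = [random.gauss(0.0, 1.0) for _ in range(n)]  # stands in for the matching score
responses = [1 if random.random() < 1.0 / (1.0 + math.exp(-(a + 1.0 * g))) else 0
             for a, g in zip(ability, groups)]
stat = uniform_dif_stat(ability, groups, responses)
print(f"LR statistic = {stat:.2f}, DIF flagged: {stat > 3.84}")
```

With a one-logit group effect and 400 examinees, the likelihood-ratio statistic comfortably exceeds the 3.84 critical value, so the simulated item is flagged. In practice, an IRT-based comparison such as the authors' 2PL analysis is preferred because it models item discrimination as well as difficulty.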