Peer-rubric evaluation in dissection courses: enhancing student performance and engagement


Abstract

Background: Medical education increasingly incorporates active learning strategies, and dissection courses remain essential for anatomical understanding. However, traditional dissection often lacks structured feedback. This study investigated the effectiveness of peer-rubric evaluation on learning and motivation outcomes in a second-year medical course on the gastrointestinal system.

Methods: Students were randomized into a rubric group (n=105), who received peer evaluation based on a five-criterion rubric across seven dissection sessions, and a control group (n=309). Academic performance was assessed using a pretest, a practical posttest, and a theoretical examination. Student and evaluator experiences with the rubric were collected through surveys. Group differences in posttest and theoretical examination scores were analyzed using an independent Student's t-test. Pearson correlation analysis assessed the relationship between change in rubric scores and academic performance. A linear mixed-model analysis evaluated trends in rubric scores across sessions.

Results: No statistically significant differences were found in posttest scores (rubric group mean 15.50 ± 2.30/20; control 15.44 ± 2.75/20; p=0.86) or theoretical examination scores (rubric 14.22 ± 2.75/20; control 13.87 ± 3.27/20; p=0.33). In the rubric group, a weak positive correlation (r=0.20, p=0.04) was observed between improvement in rubric scores over time and posttest and theoretical examination performance. Linear mixed-model analysis showed a significant upward trend in rubric scores across sessions (p<0.001), indicating improved performance over time. Survey data revealed mixed perceptions of peer evaluation's objectivity and learning impact: some students valued its role in promoting engagement, while others were sceptical about its fairness and influence on final scores. Evaluators found the rubric clarified expectations but noted issues with grading consistency and workload.
Conclusions: Despite no significant effect on summative outcomes, peer-rubric evaluation appears to enhance learning progression during practical sessions. Refining rubric design, strengthening assessor training, and adjusting assessment weighting may improve its effectiveness in anatomy education.
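The group comparison and correlation analysis described in the Methods can be sketched as follows. This is a minimal illustration using simulated scores whose means and standard deviations mirror those reported above; it is not the study's data, and the `students_t` helper is a hypothetical implementation of the pooled-variance test named in the Methods.

```python
# Minimal sketch of the analyses described in the Methods, run on
# simulated scores (NOT the study's data); means/SDs mirror those reported.
import numpy as np

rng = np.random.default_rng(42)
rubric = rng.normal(15.50, 2.30, 105)    # rubric group posttest scores (/20)
control = rng.normal(15.44, 2.75, 309)   # control group posttest scores (/20)

def students_t(a, b):
    """Independent two-sample Student's t statistic (pooled variance)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled * (1 / na + 1 / nb))

t = students_t(rubric, control)

# Pearson correlation between per-student rubric-score improvement
# (simulated here) and posttest performance, as in the rubric-group analysis
improvement = rng.normal(1.0, 0.5, 105)
r = np.corrcoef(improvement, rubric)[0, 1]
print(f"t = {t:.2f}, r = {r:.2f}")
```

With group means this close, the t statistic stays small and the correlation of two independently simulated variables hovers near zero; the study's reported r=0.20 reflects its real paired data, which this sketch does not reproduce.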
