Strategies for Creating Uncertainty in the AI Era to Trigger Students’ Critical Thinking: Pedagogical Design, Assessment Rubric, and Exam System

Abstract

Generative AI challenges traditional assessments by allowing students to produce correct answers without demonstrating understanding or reasoning. Rather than prohibiting AI, this work argues that one way to integrate AI into education is to create uncertain situations with the help of AI models and to use thinking-oriented teaching approaches, where uncertainty is a central pedagogical concept for stimulating students' critical thinking. Drawing on research in epistemology and critical thinking, we propose designing learning activities and assessments around the inherent limitations of both AI models and instructors. This encourages students to reason, question, and justify their final answers. We show how explicitly controlling AI behavior during exams (such as preventing direct answers or generating plausible but flawed responses) keeps AI from becoming a shortcut to certainty. To support this pedagogy, we introduce MindMosaicAIExam, an exam system that integrates controllable AI tools and requires students to provide initial answers, critically evaluate AI outputs, and iteratively refine their reasoning. We also present an evaluation rubric designed to assess critical thinking based on students' reasoning artifacts collected by the exam system.
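
The following sketch is not the authors' implementation; it is a minimal Python illustration of the kind of control the abstract describes, assuming a hypothetical `ExamSession` class, a guardrail system prompt, and a placeholder `call_llm` function standing in for whatever model API the exam system actually uses. It shows two of the ideas summarized above: the AI is instructed never to give direct answers (and may offer deliberately flawed reasoning to critique), and the student must commit to an initial answer before any AI help is available, with the full exchange logged as a reasoning artifact for rubric-based assessment.

```python
# Hypothetical sketch, not the MindMosaicAIExam implementation.
from dataclasses import dataclass, field

# Illustrative guardrail prompt: forbid direct answers, encourage critique.
GUARDRAIL_PROMPT = (
    "You are an exam assistant. Never state the final answer. "
    "Respond only with clarifying questions, counterexamples, or hints, "
    "and occasionally offer a plausible but flawed line of reasoning "
    "that the student must identify and critique."
)


@dataclass
class ExamSession:
    question: str
    initial_answer: str | None = None
    transcript: list[dict] = field(default_factory=list)

    def submit_initial_answer(self, answer: str) -> None:
        # The student commits to an answer before consulting the AI.
        self.initial_answer = answer

    def ask_ai(self, student_message: str, call_llm) -> str:
        # `call_llm` is a placeholder: any function that maps a list of
        # chat messages to a model reply.
        if self.initial_answer is None:
            raise PermissionError("Submit your own answer before using the AI.")
        messages = (
            [{"role": "system", "content": GUARDRAIL_PROMPT}]
            + self.transcript
            + [{"role": "user", "content": student_message}]
        )
        reply = call_llm(messages)
        # Every exchange is stored as a reasoning artifact for later
        # evaluation against the critical-thinking rubric.
        self.transcript += [
            {"role": "user", "content": student_message},
            {"role": "assistant", "content": reply},
        ]
        return reply
```

A rubric scorer could then read `transcript` and `initial_answer` to judge how the student questioned, tested, and revised their reasoning rather than grading only the final answer.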
