A tutorial on distribution-free uncertainty quantification using conformal prediction

Abstract

Statistical prediction models are ubiquitous in psychological research and practice. Increasingly, machine learning models are used. Quantifying the uncertainty of such predictions is rarely considered, partly because prediction intervals are not defined for many of the algorithms used. However, generating and reporting prediction models without information on the uncertainty of the predictions carries the risk of over-interpreting their accuracy. Conventional methods for prediction intervals (such as those defined for Ordinary Least Squares regression) are sensitive to violations of several distributional assumptions. This tutorial introduces conformal prediction, a model-agnostic, distribution-free method for generating prediction intervals with guaranteed marginal coverage, to psychological research. We start by introducing the basic rationale of prediction intervals using a motivating example. Then, we proceed to conformal prediction, which is illustrated in three increasingly complex examples, using publicly available data and R code.
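To make the idea concrete, the following is a minimal sketch of split conformal prediction in R, using absolute residuals as nonconformity scores. The simulated data, the linear model, and the miscoverage level alpha = 0.1 are illustrative assumptions for this sketch, not the article's actual examples; the same recipe applies to any point-prediction model.

```r
# Minimal split conformal prediction sketch (assumed, illustrative data).
set.seed(1)

n <- 1000
x <- runif(n, 0, 10)
y <- 2 * x + rnorm(n, sd = 2)

# Split the data into a proper training set and a calibration set
train_idx <- sample(n, n / 2)
train <- data.frame(x = x[train_idx],  y = y[train_idx])
calib <- data.frame(x = x[-train_idx], y = y[-train_idx])

# Fit any point-prediction model on the training half (here: OLS)
fit <- lm(y ~ x, data = train)

# Nonconformity scores: absolute residuals on the calibration half
scores <- abs(calib$y - predict(fit, newdata = calib))

# Conformal quantile: the ceiling((n_cal + 1) * (1 - alpha))-th smallest
# score, which yields guaranteed marginal coverage of at least 1 - alpha
alpha <- 0.1
n_cal <- nrow(calib)
q_hat <- sort(scores)[ceiling((n_cal + 1) * (1 - alpha))]

# Prediction interval for a new observation: point prediction +/- q_hat
x_new <- data.frame(x = 5)
pred  <- predict(fit, newdata = x_new)
c(lower = pred - q_hat, upper = pred + q_hat)
```

Because the coverage guarantee rests only on the exchangeability of calibration and test observations, the linear model above could be swapped for any machine learning algorithm without changing the interval construction.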