Fast uncertainty quantification in EZ cognitive models


Abstract

Classical approaches to uncertainty quantification in cognitive modeling rely on computationally expensive Monte Carlo methods or resampling of raw trial data, which creates a barrier to real-time analysis and large-scale studies. We present a computationally efficient bootstrap method that operates directly on summary statistics, exploiting the synthetic likelihood structure of a small class of cognitive models that includes the simple diffusion model and the circular diffusion model, as well as signal detection theory and multinomial processing tree models. The method relies on a numerical transformation-of-variables technique: the known sampling distributions of summary statistics of behavior are parametrically bootstrapped, and the resampled statistics are then transformed into parameter estimates through a known analytical system. This approach requires no assumptions beyond those already made by the models themselves, yet achieves over 1000-fold speed improvements relative to already efficient fully Bayesian methods. The proposed method makes real-time uncertainty quantification accessible and enables new applications in adaptive testing, meta-analysis, and exploratory data analysis.
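To make the pipeline concrete, the following is a minimal sketch of the idea for the EZ-diffusion case, not the authors' implementation. It uses the analytic inverse equations of the EZ-diffusion model (Wagenmakers et al., 2007) and assumes illustrative sampling distributions for the summary statistics: binomial for accuracy, normal for the mean RT, and chi-square scaling for the RT variance. The function names and the edge-correction choices are my own assumptions.

```python
import numpy as np

def ez_inverse(pc, vrt, mrt, s=0.1):
    """Analytic EZ-diffusion inverse (Wagenmakers et al., 2007).

    pc  : proportion correct
    vrt : variance of response times
    mrt : mean response time
    s   : scaling parameter (0.1 in the original paper)
    Returns drift rate v, boundary separation a, non-decision time ter.
    """
    L = np.log(pc / (1.0 - pc))                       # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = np.sign(pc - 0.5) * s * x**0.25               # drift rate
    a = s**2 * L / v                                  # boundary separation
    y = np.exp(-v * a / s**2)
    mdt = (a / (2.0 * v)) * (1.0 - y) / (1.0 + y)     # mean decision time
    ter = mrt - mdt                                   # non-decision time
    return v, a, ter

def bootstrap_ez(pc_hat, vrt_hat, mrt_hat, n, draws=5000, seed=0, s=0.1):
    """Parametric bootstrap on the summary statistics, pushed through the
    analytic inverse. The sampling distributions below are assumptions
    for illustration, not necessarily those used in the article."""
    rng = np.random.default_rng(seed)
    # Resample the summary statistics from their sampling distributions.
    pc = rng.binomial(n, pc_hat, draws) / n
    pc = np.clip(pc, 0.5 / n, 1.0 - 0.5 / n)          # keep pc strictly in (0, 1)
    pc = np.where(pc == 0.5, 0.5 + 0.5 / n, pc)       # avoid L = 0 (division by zero)
    mrt = rng.normal(mrt_hat, np.sqrt(vrt_hat / n), draws)
    vrt = vrt_hat * rng.chisquare(n - 1, draws) / (n - 1)
    # Transform every resampled triple to parameter estimates at once.
    return ez_inverse(pc, vrt, mrt, s=s)
```

A bootstrap confidence interval is then just a percentile of the transformed draws, e.g. `np.percentile(v, [2.5, 97.5])` for the drift rate, which is what makes the method fast: no likelihood evaluations or trial-level resampling are needed.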
