An introduction to Sequential Monte Carlo for Bayesian inference and model comparison -- with examples for psychology and behavioural science

Abstract

Bayesian inference is becoming an increasingly popular framework for statistics in the behavioural sciences. However, its application is hampered by computational intractability -- almost all Bayesian analyses require some form of approximation. While some approximate inference algorithms, such as Markov chain Monte Carlo (MCMC), have become well known throughout the literature, other approaches exist that are not as widespread. Here, we provide an introduction to another family of approximate inference techniques known as Sequential Monte Carlo (SMC). We show that SMC brings a number of benefits, which we illustrate in three examples: linear regression and variable selection for depression, growth curve mixture modelling of grade point averages, and computational modelling of the Iowa Gambling Task. These use cases demonstrate that SMC is efficient in exploring posterior distributions, reaching predictive performance similar to that of state-of-the-art MCMC approaches in less wall-clock time. Moreover, they show that SMC is effective in dealing with multi-modal distributions, and that SMC not only approximates the posterior distribution but simultaneously provides a useful estimate of the marginal likelihood, the essential quantity in Bayesian model comparison. All of this comes at no additional effort from the end user.
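To make the idea concrete, a common SMC variant moves a population of particles from the prior to the posterior through a sequence of tempered distributions, and the incremental importance weights yield the marginal likelihood as a by-product. The sketch below illustrates this on a toy conjugate-Gaussian model; all parameter choices (particle count, temperature schedule, proposal scale) are illustrative assumptions, not the implementation used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer the mean mu of a Gaussian with known unit variance.
# Prior: mu ~ N(0, prior_sd^2).  Data: y_i ~ N(mu_true, 1).
prior_sd = 2.0
mu_true = 1.5
y = rng.normal(mu_true, 1.0, size=50)

def log_like(mu):
    # Log-likelihood of all data points, vectorised over a particle array.
    return (-0.5 * np.sum((y[None, :] - mu[:, None]) ** 2, axis=1)
            - 0.5 * len(y) * np.log(2 * np.pi))

def log_prior(mu):
    return -0.5 * (mu / prior_sd) ** 2 - 0.5 * np.log(2 * np.pi * prior_sd ** 2)

# SMC with likelihood tempering: beta rises from 0 (prior) to 1 (posterior).
n_particles = 2000
betas = np.linspace(0.0, 1.0, 21)
particles = rng.normal(0.0, prior_sd, size=n_particles)
log_evidence = 0.0  # accumulates the log marginal likelihood estimate

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Incremental importance weights for the new temperature.
    log_w = (b - b_prev) * log_like(particles)
    # Log of the mean incremental weight, computed stably (max-shift).
    log_evidence += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Multinomial resampling to equalise the weights.
    idx = rng.choice(n_particles, size=n_particles, p=w)
    particles = particles[idx]
    # One random-walk Metropolis move targeting the tempered posterior.
    prop = particles + rng.normal(0.0, 0.2, size=n_particles)
    log_accept = ((b * log_like(prop) + log_prior(prop))
                  - (b * log_like(particles) + log_prior(particles)))
    accept = np.log(rng.uniform(size=n_particles)) < log_accept
    particles = np.where(accept, prop, particles)

print("posterior mean estimate:", particles.mean())
print("log marginal likelihood estimate:", log_evidence)
```

Because the model is conjugate, both the posterior mean and the log marginal likelihood have closed forms, so the particle estimates can be checked directly; in non-conjugate models the same loop runs unchanged, which is the appeal of SMC for the applied settings described above.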
