Multivariate Bayesian Inversion for Classification and Regression

Abstract

We propose the statistical modelling approach to supervised learning (i.e. predicting labels from features) as an alternative to algorithmic machine learning (ML). The approach is demonstrated by employing a multivariate general linear model (MGLM) describing the effects of labels on features, possibly accounting for covariates of no interest, in combination with prior distributions on the model parameters. ML “training” is translated into estimating the MGLM parameters via Bayesian inference and ML “testing” or application is translated into Bayesian model comparison – a reciprocal relationship we refer to as multivariate Bayesian inversion (MBI). We devise MBI algorithms for the standard cases of supervised learning, discrete classification and continuous regression, derive novel classification rules and regression predictions, and use practical examples (simulated and real data) to illustrate the benefits of the statistical modelling approach: interpretability, incorporation of prior knowledge, and probabilistic predictions. We close by discussing further advantages, disadvantages and the future potential of MBI.
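To make the training/testing correspondence concrete, the following is a minimal sketch, not the authors' actual MBI algorithm: a toy two-class problem in which "training" is a conjugate Bayesian update of per-class mean parameters under a Gaussian prior, and "testing" is Bayesian model comparison, scoring a new feature vector under each class's posterior predictive density to obtain probabilistic class assignments. All variable names, the prior variance `tau2`, and the assumption of unit noise variance are illustrative choices, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two classes with mean-shifted Gaussian features
# (hypothetical example, not the article's data)
n, p = 100, 4
y = rng.integers(0, 2, n)
mu_true = np.array([[0.0] * p, [1.0] * p])
X = mu_true[y] + rng.standard_normal((n, p))

# "Training": posterior over each class mean under a N(0, tau2 * I) prior
# with known unit noise variance (conjugate normal-normal update)
tau2 = 10.0
post_mean, post_var = {}, {}
for c in (0, 1):
    Xc = X[y == c]
    post_var[c] = 1.0 / (len(Xc) + 1.0 / tau2)
    post_mean[c] = post_var[c] * Xc.sum(axis=0)

# "Testing": Bayesian model comparison -- posterior class probabilities
# from each class model's posterior predictive density (flat class prior)
def class_posterior(x):
    logp = np.empty(2)
    for c in (0, 1):
        s2 = 1.0 + post_var[c]  # predictive variance per feature
        r = x - post_mean[c]
        logp[c] = -0.5 * np.sum(r**2 / s2 + np.log(2 * np.pi * s2))
    w = np.exp(logp - logp.max())  # normalise in a numerically stable way
    return w / w.sum()

probs = class_posterior(np.ones(p))  # a point near class 1's mean
```

The output is a full posterior distribution over labels rather than a hard decision, which illustrates the "probabilistic predictions" benefit noted in the abstract; prior knowledge enters through `tau2`, and every quantity in the pipeline has an explicit statistical interpretation.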