Gradient boosting for generalised additive mixed models
Abstract
Generalised additive mixed models are a common tool for modelling grouped or longitudinal data, where random effects are incorporated into the model to account for within-group or inter-individual correlations. As an alternative to established penalised maximum likelihood approaches, several types of boosting routines have been developed to make more demanding data situations manageable. However, when estimating mixed models with component-wise gradient boosting, random and fixed effects compete within the variable selection mechanism. This can result in irregular selection properties and biased parameter estimates, particularly when covariates are constant within clusters. Moreover, while researchers are typically more interested in the covariance structure of the random effects than in the effects themselves, current gradient boosting implementations focus solely on estimating the random effects. To overcome these drawbacks we propose a novel gradient boosting scheme for generalised additive mixed models. The approach is implemented in the R package mermboost, which is seamlessly wrapped around the established mboost framework, maintaining its flexibility while enhancing functionality. The improved performance of the new framework is demonstrated in an extensive simulation study and real-world applications.