Predicting agranulocytosis in patients treated with clozapine – development and validation of a machine learning algorithm based on 5,550 patients
Abstract
Background
To prevent clozapine-induced agranulocytosis (CIA), patients’ white blood cell counts are closely monitored, with treatment stopped if the absolute neutrophil count (ANC) drops below 1.5×10⁹/L. While effective, this approach has a high rate of false positives. This study aimed to develop a machine learning (ML) decision-making tool to better predict CIA risk using pattern-based criteria (two consecutive ANCs <0.5×10⁹/L over ≥2 days).
Methods
Using an ML technique [gradient-boosted decision trees (GBDT)], we analysed clinical data from 5,550 UK patients treated with clozapine: 2,190 controls with no history of neutropenia and 3,360 cases with at least one neutropenic event, including 358 with pattern-based CIA. Using haematological and demographic data from the current and three prior time windows, predictive models estimated the likelihood of CIA across four forecasting horizons: 1 week, 2 weeks, 1 month, and 3 months in advance. Model performance was evaluated using the area under the receiver operating characteristic curve (AUROC), sensitivity, and specificity. We developed a further model to predict baseline risk of CIA and compared its performance with that of genetic tests. Explainability analyses identified the key features influencing predictions.
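The modelling approach described above can be illustrated with a minimal sketch: a GBDT classifier trained on haematological and demographic features, scored with AUROC, sensitivity, and specificity. The data, feature names, and thresholds below are entirely synthetic and illustrative; they are not the study's dataset or final model configuration.

```python
# Hypothetical sketch of a GBDT classifier for CIA risk, with AUROC,
# sensitivity, and specificity evaluation. All data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Illustrative features: current ANC, three prior ANC windows, age,
# and treatment duration (all invented for this sketch).
X = np.column_stack([
    rng.normal(3.0, 1.0, n),   # current ANC (x10^9/L)
    rng.normal(3.0, 1.0, n),   # ANC, 1 window back
    rng.normal(3.0, 1.0, n),   # ANC, 2 windows back
    rng.normal(3.0, 1.0, n),   # ANC, 3 windows back
    rng.integers(18, 70, n),   # age (years)
    rng.integers(1, 520, n),   # weeks on clozapine
])
# Synthetic label: low current/recent ANC raises the event probability.
risk = 1 / (1 + np.exp(2.5 * (X[:, 0] + 0.5 * X[:, 1]) - 8.0))
y = (rng.random(n) < risk).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

scores = model.predict_proba(X_te)[:, 1]
auroc = roc_auc_score(y_te, scores)
tn, fp, fn, tp = confusion_matrix(y_te, scores >= 0.5).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUROC {auroc:.2f}, sensitivity {sensitivity:.2f}, "
      f"specificity {specificity:.2f}")
```

In practice the decision threshold (0.5 here) would be tuned per forecasting horizon to trade sensitivity against specificity, as the study reports varying thresholds for the baseline model.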
Outcomes
GBDT models demonstrated strong predictive performance: 1-week forecasting horizon, AUROC 0.99 [95% confidence interval (CI): 0.99–0.99]; 2 weeks, AUROC 0.97 [95% CI: 0.95–0.99]; 1 month, AUROC 0.91 [95% CI: 0.86–0.94]; 3 months, AUROC 0.90 [95% CI: 0.88–0.92]. The baseline model outperformed current genetic tests, with high specificity and sensitivity at varying thresholds. Key discriminative features for CIA included age and baseline haematological values for the longer forecasting horizons (1 and 3 months), and current haematological values and treatment duration for the shorter horizons (1 and 2 weeks).
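Confidence intervals for AUROC, of the kind quoted above, are commonly obtained by bootstrap resampling of the test set. The sketch below shows one such percentile-bootstrap computation on synthetic labels and scores; it is an assumed illustration of the interval type, not the study's actual procedure.

```python
# Hypothetical sketch: percentile-bootstrap 95% CI for AUROC.
# Labels and scores are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 500)
scores = np.clip(y * 0.6 + rng.normal(0.2, 0.25, 500), 0, 1)

aucs = []
for _ in range(2000):
    idx = rng.integers(0, len(y), len(y))   # resample with replacement
    if len(np.unique(y[idx])) < 2:          # need both classes present
        continue
    aucs.append(roc_auc_score(y[idx], scores[idx]))
lo, hi = np.percentile(aucs, [2.5, 97.5])
point = roc_auc_score(y, scores)
print(f"AUROC {point:.2f} [95% CI: {lo:.2f}-{hi:.2f}]")
```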
Interpretation
ML models reliably predict CIA occurrence across short- and long-term horizons, potentially reducing the number of false positives relative to the current system. Implementing such models could reduce unnecessary treatment interruptions and the need for additional blood tests prompted by suspected agranulocytosis.
Funding
The study did not receive direct funding.
Research in context
Evidence before this study
Clozapine is the only antipsychotic effective for treatment-resistant psychosis. Tragically, many patients with treatment-resistant psychosis never receive clozapine, or receive it many years after treatment resistance is established. A prominent reason for this is the blood testing required to detect potential clozapine-induced agranulocytosis (CIA). Although monitoring is effective, several patients have had to stop clozapine unnecessarily because of the current haematological criteria for discontinuation, which has resulted in poor clinical and social outcomes for many of them. Additionally, many cases of agranulocytosis are identified late under existing monitoring protocols. At present, there is no reliable way of predicting CIA.
Added value of this study
This is the first study to propose that a machine-learning decision tool can reliably predict CIA before it occurs in both the short term and long term.
Implications of all the available evidence
Implementation of machine learning algorithms allows prediction of agranulocytosis so that clozapine can be stopped appropriately before it occurs. The algorithm can also prevent unnecessary discontinuation of clozapine and the additional blood testing that follows spurious blood results.