Statistical Properties and Power Analysis of Divergence Measures for Credit Risk Model Monitoring
Abstract
Divergence measures are essential tools for detecting distributional shifts in model monitoring, a task made especially important by the volatility of financial data. We argue that the Jensen-Shannon divergence (JSD) offers a more robust metric for mixture models because it resolves common zero-frequency artifacts, while the Kullback-Leibler divergence (KLD) remains the optimal choice for Bayesian frameworks. This study extends the work of [1] with two primary contributions. First, we derive the statistical properties and chi-square benchmark values for JSD and KLD. Second, we demonstrate their applicability by detecting distributional changes in credit default probabilities generated by Merton, Merton-with-jump, and stochastic-volatility-with-jump models. Our results establish that JSD and KLD follow chi-square distributions and reveal an important practical trade-off: JSD exhibits superior Type I error control, maintaining rejection rates closest to the nominal 5% level and thereby minimizing false positives, but this conservatism reduces statistical power in small samples, so larger samples are required for reliable detection. This trade-off enables practitioners to select a measure according to whether minimizing false alarms or maximizing detection sensitivity is the priority.
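The zero-frequency artifact mentioned above can be sketched numerically: when a histogram bin of the reference distribution is empty, KLD diverges to infinity, whereas JSD, which compares each distribution to their mixture, stays finite. The following is a minimal illustration using SciPy; the specific probability vectors are invented for demonstration and are not taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

# Two discrete distributions; q has a zero-frequency bin,
# a common artifact when binning small monitoring samples.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.6, 0.4, 0.0])

# KLD D(p||q) is infinite: q assigns zero mass to a bin where p does not.
kld = entropy(p, q)

# JSD compares each input to the mixture m = (p + q) / 2, which is
# nonzero wherever either input is, so the divergence stays finite
# (and is bounded above by ln 2 in nats).
jsd = jensenshannon(p, q, base=np.e) ** 2  # squared distance = divergence

print(kld)  # inf
print(jsd)  # finite value below ln(2)
```

Note that `scipy.spatial.distance.jensenshannon` returns the square root of the divergence (a metric), so it is squared here to recover the JSD itself.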