Mitigating Lipschitz Singularities in Long-Tailed Diffusion Models via Time-Step Sharing Strategy

Abstract

Diffusion models have demonstrated remarkable success in generating diverse and high-fidelity images. However, their performance often degrades on long-tailed datasets, where head classes significantly outnumber tail classes. This imbalance biases training toward head classes at the expense of tail classes. To address this challenge, we conduct an in-depth analysis of the Lipschitz singularity that arises near zero timesteps in long-tailed diffusion models, causing numerical instability and degraded image quality. We propose a time-step-sharing-based class-balancing diffusion model (TCDM) that mitigates this issue by combining a shared-timestep strategy with a conditional probability transfer mechanism. TCDM improves noise-prediction accuracy and the stability of information transfer, enhancing image generation quality for tail classes. Experimental results on multiple long-tailed datasets, including CIFAR-100LT and CIFAR-10LT, demonstrate TCDM's superior performance: it achieves leading FID scores and higher recall than existing methods. In particular, TCDM significantly reduces the Lipschitz constant near zero timesteps, ensuring more stable and accurate noise prediction and feature transfer. This work contributes to the broader field of generative models by providing a robust solution for handling long-tailed distributions in image generation. Our code is available at https://github.com/shiyanbei306/TCDM.
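The Lipschitz blow-up near zero timesteps that motivates this work can be illustrated with a short numerical sketch. This is not the authors' implementation: the variance-preserving noise schedule, its parameters, and the floor value `eps` are all illustrative assumptions, and timestep sharing is approximated here by mapping every timestep below `eps` to the shared value `eps`.

```python
import numpy as np

def sigma(t, beta_min=0.1, beta_max=20.0):
    # Noise std of a variance-preserving schedule (illustrative parameters):
    # sigma(t)^2 = 1 - exp(-(beta_min*t + 0.5*(beta_max - beta_min)*t^2)).
    log_alpha = -(beta_min * t + 0.5 * (beta_max - beta_min) * t ** 2)
    return np.sqrt(1.0 - np.exp(log_alpha))

def lipschitz_estimate(f, ts, h=1e-6):
    # Finite-difference estimate of max |f'(t)| over the grid ts.
    return max(abs(f(t + h) - f(t)) / h for t in ts)

# The score/noise target scales like 1/sigma(t), which diverges as t -> 0.
ts = np.linspace(1e-4, 1e-2, 50)
raw = lipschitz_estimate(lambda t: 1.0 / sigma(t), ts)

# Sharing: all timesteps below eps use the shared timestep eps (our assumption
# about the strategy), so the scaling is constant on [0, eps) and stays bounded.
eps = 1e-2
shared = lipschitz_estimate(lambda t: 1.0 / sigma(max(t, eps)), ts)

print(f"Lipschitz estimate near t=0 without sharing: {raw:.1f}")
print(f"with timestep sharing (t -> max(t, eps)):    {shared:.1f}")
```

Under these assumed schedule parameters, the unshared estimate is several orders of magnitude larger than the shared one, which is the instability the paper attributes to the singularity.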
