Self-Supervised Curriculum-based Class Incremental Learning

Abstract

Class-incremental learning, a sub-field of continual learning, faces catastrophic forgetting, where models forget previous tasks while learning new ones. Existing solutions fall into expansion-based, memory-based, and regularization-based approaches, with comparatively little attention paid to the last despite its deployability and efficiency. This paper introduces Self-Supervised Curriculum-based Class Incremental Learning (S²C²IL), a novel regularization-based algorithm that improves class-incremental performance without external memory or network expansion. S²C²IL leverages self-supervised learning to extract rich feature representations using a new pretext task based on stochastic label augmentation instead of image augmentation. To prevent the transfer of pretext-task-specific knowledge, the final section of the pre-trained network is excluded during feature transfer. For downstream tasks, a curriculum strategy periodically adjusts the standard deviation of a filter fused with the network. Evaluated on split-CIFAR10, split-CIFAR100, split-SVHN, and split-TinyImageNet, S²C²IL achieves state-of-the-art results, outperforming existing regularization-based and memory-based class-incremental algorithms.
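
The pretext task can be made concrete with a short sketch. The following is a hypothetical PyTorch rendering of stochastic label augmentation, not the authors' reference code: each image receives several independently drawn random labels, held fixed for the whole pretext phase, and one auxiliary head per label set is trained to fit them. The head count, pseudo-class count, and all names are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SLAPretextNet(nn.Module):
    # Shared backbone with several auxiliary heads; head k predicts the
    # k-th randomly assigned pseudo-label of each image.
    def __init__(self, backbone, feat_dim, num_heads=4, num_pseudo_classes=10):
        super().__init__()
        self.backbone = backbone
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, num_pseudo_classes) for _ in range(num_heads)
        )

    def forward(self, x):
        z = self.backbone(x)                     # shared feature extractor
        return [head(z) for head in self.heads]  # one set of logits per head

def sla_loss(logits_per_head, pseudo_labels):
    # pseudo_labels: LongTensor of shape (num_heads, batch), drawn once per
    # image before training and held fixed so each head has a consistent target.
    return sum(F.cross_entropy(lg, y)
               for lg, y in zip(logits_per_head, pseudo_labels))

In use, one would draw pseudo_labels = torch.randint(0, num_pseudo_classes, (num_heads, len(dataset))) once and index it by each batch's sample indices; per the abstract, the final section of the network pre-trained this way is then discarded, so pretext-specific knowledge does not carry over to the incremental stage.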
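
The curriculum component can be sketched similarly. The abstract only states that the standard deviation of a filter fused with the network is adjusted periodically; a depthwise Gaussian smoothing layer with a decaying sigma is one plausible reading, and the kernel size, schedule shape, and constants below are assumptions rather than the paper's settings.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianFilter2d(nn.Module):
    # Depthwise Gaussian smoothing whose sigma can be reset between epochs.
    def __init__(self, channels, kernel_size=5, sigma=2.0):
        super().__init__()
        self.channels = channels
        self.kernel_size = kernel_size
        self.set_sigma(sigma)

    def set_sigma(self, sigma):
        # Rebuild the normalized 2-D Gaussian kernel for the new sigma.
        half = (self.kernel_size - 1) / 2.0
        x = torch.arange(self.kernel_size, dtype=torch.float32) - half
        g = torch.exp(-(x ** 2) / (2.0 * sigma ** 2))
        k2d = torch.outer(g, g)
        self.register_buffer(
            "weight",
            (k2d / k2d.sum()).expand(self.channels, 1, -1, -1).clone(),
            persistent=False,
        )

    def forward(self, x):
        return F.conv2d(x, self.weight, padding=self.kernel_size // 2,
                        groups=self.channels)

def sigma_schedule(epoch, sigma0=2.0, decay=0.5, period=10, sigma_min=0.1):
    # Easy-to-hard curriculum: start with heavy smoothing, then periodically
    # sharpen the input so later epochs see progressively finer detail.
    return max(sigma_min, sigma0 * decay ** (epoch // period))

Calling filt.set_sigma(sigma_schedule(epoch)) at the start of each epoch realizes the periodic adjustment; whether the paper decreases or increases the standard deviation, and at what period, is not specified in the abstract.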
