Data-Free Pruning of CNN Using Kernel Similarity

Abstract

Channel pruning can effectively compress convolutional neural networks (CNNs) for deployment on edge devices. Most existing pruning methods are data-driven: they rely heavily on datasets and require fine-tuning the pruned model for several epochs. However, data privacy protection makes datasets harder to obtain, and when data is inaccessible these pruning methods become infeasible. To address this issue, we propose a data-free CNN pruning method that requires no data, consisting of two steps: filter reconstruction and feature reconstruction. To reduce the number of kernels in each filter, we group the kernels by similarity and compute a representative kernel for each group to reconstruct the filter. During inference, we perform feature reconstruction so that the input channels match the reconstructed filters, satisfying the operational requirements of convolution. We validate the effectiveness of our method through extensive experiments with ResNet, MobileNet, and VGG on the CIFAR-10 and ImageNet datasets. For ResNet-50 on ImageNet, we achieve a 56.2% FLOPs reduction with only a 0.52% drop in Top-1 accuracy.
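The two steps described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes Euclidean distance as the kernel-similarity measure, a simple k-means grouping, and group means as the representative kernels, and it sums the input channels within each group so that convolving the reduced filter with the reconstructed features approximates the original convolution. All function names are hypothetical.

```python
import numpy as np

def reconstruct_filter(filt, num_groups, seed=0):
    """Filter reconstruction (sketch): cluster a filter's kernels by
    similarity and replace each group with its mean kernel.
    filt: array of shape (C_in, k, k). Returns the representative
    kernels (num_groups, k, k) and each channel's group assignment."""
    c_in, k, _ = filt.shape
    flat = filt.reshape(c_in, -1)
    rng = np.random.default_rng(seed)
    # Simple k-means on flattened kernels (one choice of similarity grouping).
    centers = flat[rng.choice(c_in, num_groups, replace=False)].copy()
    for _ in range(20):
        dists = ((flat[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = dists.argmin(axis=1)
        for g in range(num_groups):
            if (assign == g).any():
                centers[g] = flat[assign == g].mean(axis=0)
    return centers.reshape(num_groups, k, k), assign

def reconstruct_features(x, assign, num_groups):
    """Feature reconstruction (sketch): sum the input channels that share
    a group, so the reduced filter sees one channel per representative
    kernel. Uses sum_c w_c * x_c ~= sum_g w_g * (sum_{c in g} x_c).
    x: array of shape (C_in, H, W)."""
    out = np.zeros((num_groups,) + x.shape[1:], dtype=x.dtype)
    for c, g in enumerate(assign):
        out[g] += x[c]
    return out
```

After feature reconstruction, the input has `num_groups` channels, matching the reduced filter, so a standard convolution can proceed; the approximation is exact when all kernels in a group are identical.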
