Abstract

Previous works utilized the "smaller-norm-less-important" criterion to prune filters with smaller norm values in a convolutional neural network. In this paper, we analyze this norm-based criterion and point out that its effectiveness depends on two requirements that are not always met: (1) the norm deviation of the filters should be large; (2) the minimum norm of the filters should be small. To solve this problem, we propose a novel filter pruning method, namely Filter Pruning via Geometric Median (FPGM), to compress the model regardless of those two requirements. Unlike previous methods, FPGM compresses CNN models by pruning filters with redundancy, rather than those with "relatively less" importance. When applied to two image classification benchmarks, our method validates its usefulness and strengths. Notably, on CIFAR-10, FPGM reduces FLOPs by more than 52% on ResNet-110 with even a 2.69% relative accuracy improvement. Moreover, on ILSVRC-2012, FPGM reduces FLOPs by more than 42% on ResNet-101 without any top-5 accuracy drop, advancing the state of the art. Code is publicly available on GitHub: https://github.com/he-y/filter-pruning-geometric-median
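
To make the geometric-median idea in the abstract concrete, the following is a minimal sketch (not the authors' released code) of a per-layer filter-selection step, written in PyTorch. The function name and the use of the summed pairwise distance as a proxy for closeness to the layer's geometric median are illustrative assumptions.

import torch

def fpgm_select_redundant(weight, num_prune):
    # `weight` is assumed to be a Conv2d weight of shape (out_channels, in_channels, k, k).
    # Flatten each filter to a vector, compute pairwise L2 distances, and treat the
    # filters with the smallest total distance to all others (i.e., the ones nearest
    # the layer's geometric median) as the most redundant ones.
    filters = weight.view(weight.size(0), -1)
    dists = torch.cdist(filters, filters, p=2)
    total_dist = dists.sum(dim=1)
    return torch.argsort(total_dist)[:num_prune]

# Illustrative usage: mark 4 of 16 filters in a random 3x3 conv layer as prunable.
example_weight = torch.randn(16, 8, 3, 3)
print(fpgm_select_redundant(example_weight, num_prune=4))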

Keywords

FLOPS, Computer science, Convolutional neural network, Redundancy (engineering), Pruning, Norm (philosophy), Filter (signal processing), Algorithm, Artificial intelligence, Pattern recognition (psychology), Computer vision, Parallel computing

Publication Info

Year
2019
Type
article
Pages
4335-4344
Citations
1186
Access
Closed

Citation Metrics

1186 (OpenAlex)

Cite This

Yang He, Ping Liu, Ziwei Wang et al. (2019). Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 4335-4344. https://doi.org/10.1109/cvpr.2019.00447

Identifiers

DOI
10.1109/cvpr.2019.00447