KCP: Kernel Cluster Pruning for Dense Labeling Neural Networks

01/17/2021
by   Po-Hsiang Yu, et al.

Pruning has become a promising technique for compressing and accelerating neural networks. Existing methods are mainly evaluated on sparse labeling applications. However, dense labeling applications are closer to real-world problems and require real-time processing on resource-constrained mobile devices; pruning for dense labeling applications remains a largely unexplored field. The prevailing filter channel pruning method removes an entire filter channel at once, ignoring the interaction between the kernels within that channel. In this study, we propose kernel cluster pruning (KCP) to prune dense labeling networks. We develop a clustering technique to identify the least representational kernels in each layer. By iteratively removing those kernels, the parameters that better represent the entire network are preserved; thus, we achieve better accuracy with a decent reduction in model size and computation. When evaluated on stereo matching and semantic segmentation neural networks, our method reduces more than 70% of FLOPs with only a slight accuracy drop. Moreover, for ResNet-50 on ILSVRC-2012, KCP achieves more than 50% FLOPs reduction with a 0.13% accuracy gain over state-of-the-art pruning results.
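The abstract's core idea, clustering the 2-D kernels of a layer and discarding those that are well represented by their cluster, can be sketched as follows. This is an illustrative reconstruction, not the authors' released code: the function name, the use of plain k-means (Lloyd's algorithm), the cluster count, and the choice of "kernel closest to its centroid" as the least representational one are all assumptions made for the sketch.

```python
import numpy as np

def least_representational_kernels(weights, n_clusters=4, n_iters=20, seed=0):
    """Cluster the 2-D kernels of one conv layer and, per cluster, flag the
    kernel closest to its centroid as least representational (a stand-in for
    the paper's selection rule; the exact criterion is an assumption here).

    weights: array of shape (out_ch, in_ch, k, k)
    Returns sorted flat indices (over out_ch * in_ch) of kernels to prune.
    """
    out_ch, in_ch, k, _ = weights.shape
    kernels = weights.reshape(out_ch * in_ch, k * k)

    # Plain k-means on the flattened kernels.
    rng = np.random.default_rng(seed)
    centroids = kernels[rng.choice(len(kernels), n_clusters, replace=False)]
    labels = np.zeros(len(kernels), dtype=int)
    for _ in range(n_iters):
        dists = np.linalg.norm(kernels[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(n_clusters):
            members = kernels[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)

    # A kernel lying near its centroid is well represented by its cluster,
    # so removing it should lose the least information.
    dist_to_own = np.linalg.norm(kernels - centroids[labels], axis=1)
    prune = []
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if len(idx):
            prune.append(int(idx[dist_to_own[idx].argmin()]))
    return sorted(prune)

# Usage on one hypothetical 3x3 conv layer; in the iterative scheme the
# flagged kernels would be zeroed and the network fine-tuned between rounds.
w = np.random.default_rng(1).normal(size=(8, 4, 3, 3))
to_prune = least_representational_kernels(w, n_clusters=3)
```

Note that this prunes individual kernels rather than whole filter channels, which is the distinction the abstract draws against prevailing channel pruning methods.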
