MBS: Macroblock Scaling for CNN Model Reduction

09/18/2018
by Yu-Hsun Lin, et al.

We estimate the proper channel (width) scaling of Convolutional Neural Networks (CNNs) for model reduction. Unlike the traditional scaling method, which reduces every CNN channel width by the same factor, we address each CNN macroblock adaptively depending on its information redundancy, measured by our proposed effective flops. Our macroblock scaling (MBS) algorithm can be applied to various CNN architectures to reduce their model size. These applicable models range from compact CNN models such as MobileNet (25.53% reduction, ImageNet) and ShuffleNet (20.74% reduction, ImageNet) to deep ones such as ResNet-101 (51.67% reduction, CIFAR-10), with negligible accuracy degradation. MBS also achieves better reduction at a much lower cost than the state-of-the-art optimization-based method. MBS's simplicity and efficiency, its flexibility to work with any CNN model, and its scalability to models of any depth make it an attractive choice for CNN model size reduction.
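To illustrate the contrast the abstract draws, here is a minimal sketch (not the paper's implementation) of uniform channel scaling versus per-macroblock scaling. The function names, toy channel widths, and per-macroblock factors are all hypothetical; MBS derives its factors from the proposed effective-flops redundancy measure, which is not reproduced here.

```python
def uniform_scale(widths, factor):
    """Traditional scaling: shrink every channel width by the same factor."""
    return [max(1, int(w * factor)) for w in widths]

def macroblock_scale(blocks, factors):
    """MBS-style scaling: each macroblock (a list of channel widths)
    gets its own factor, e.g. chosen from its measured redundancy."""
    assert len(blocks) == len(factors)
    return [[max(1, int(w * f)) for w in block]
            for block, f in zip(blocks, factors)]

# A toy 3-macroblock CNN: channel widths grouped by macroblock.
blocks = [[64, 64], [128, 128], [256, 256]]

# Uniform scaling treats all layers alike.
flat = [w for block in blocks for w in block]
print(uniform_scale(flat, 0.75))          # every width shrunk by 0.75

# Adaptive scaling can keep early macroblocks intact and scale more
# redundant ones harder (factors here are for illustration only).
print(macroblock_scale(blocks, [1.0, 0.8, 0.6]))
```

The point of the sketch is only the shape of the idea: a single global factor versus a vector of factors, one per macroblock, applied to that macroblock's channel widths.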
