QuantNAS for super resolution: searching for efficient quantization-friendly architectures against quantization noise

08/31/2022
by   Egor Shvetsov, et al.

There is a constant need for high-performing and computationally efficient neural network models for image super-resolution (SR), which are often deployed on low-capacity devices. One way to obtain such models is to compress existing architectures, e.g., by quantization. Another option is neural architecture search (NAS), which discovers new efficient solutions. We propose a novel quantization-aware NAS procedure for a specifically designed SR search space. Our approach performs NAS to find quantization-friendly SR models. The search relies on adding quantization noise to parameters and activations instead of quantizing parameters directly. Our QuantNAS finds architectures with a better PSNR/BitOps trade-off than uniform or mixed-precision quantization of fixed architectures. Additionally, our search-against-noise procedure is up to 30% faster than directly quantizing weights.
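The central idea, replacing hard quantization with additive quantization noise during the search, can be illustrated with a short sketch. The PyTorch function below is a hypothetical illustration, not the authors' code: it models the rounding error of a symmetric uniform quantizer as additive uniform noise of one quantization step, which keeps the forward pass differentiable and avoids a straight-through estimator during architecture search.

```python
import torch

def add_quant_noise(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Simulate uniform quantization of `x` by injecting noise (sketch).

    Instead of rounding values to the quantization grid, add uniform
    noise matching the rounding error of a symmetric uniform quantizer.
    The operation stays differentiable, so gradients flow unchanged.
    """
    # Quantization step of a symmetric quantizer over [-max_val, max_val].
    max_val = x.abs().max().clamp(min=1e-8)
    step = 2 * max_val / (2 ** num_bits - 1)
    # Uniform noise in [-step/2, step/2) mimics the rounding error.
    noise = (torch.rand_like(x) - 0.5) * step
    return x + noise
```

During search, a transform like this would be applied to the weights and activations of candidate operations; at evaluation time it would be replaced by actual quantization of the selected architecture.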
