GAN-Knowledge Distillation for One-Stage Object Detection

06/20/2019
by   Wei Hong, et al.

Convolutional neural networks have significantly improved the accuracy of object detection. As convolutional neural networks become deeper, detection accuracy improves noticeably, but more floating-point computation is required. In object detection, many researchers use knowledge distillation to improve the accuracy of a small student network by transferring knowledge from a deeper and larger teacher network. Most knowledge distillation methods require designing complex cost functions, and they target two-stage object detection algorithms. This paper proposes a clean and effective knowledge distillation method for one-stage object detection. The feature maps generated by the teacher network and the student network are used as real samples and fake samples respectively, and the two are trained adversarially to improve the performance of the student network in one-stage object detection.
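The adversarial setup described above can be sketched as follows. This is a minimal illustration, not the paper's architecture: the pooled feature shapes, the linear discriminator, and the loss formulation are all simplifying assumptions standing in for the real convolutional discriminator and detection pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce(p, label):
    # Binary cross-entropy against a constant label (1 = real, 0 = fake).
    eps = 1e-7
    p = np.clip(p, eps, 1.0 - eps)
    return -(label * np.log(p) + (1.0 - label) * np.log(1.0 - p)).mean()

# Hypothetical pooled feature maps: the teacher's features play the role
# of "real" samples, the student's features play the role of "fake" ones.
teacher_feat = rng.normal(size=(4, 256))
student_feat = rng.normal(size=(4, 256))

# A single linear layer stands in for the discriminator network.
w = rng.normal(scale=0.01, size=(256, 1))

def discriminate(feat):
    # Probability that a feature map came from the teacher.
    return sigmoid(feat @ w).ravel()

# Discriminator loss: label teacher features real, student features fake.
d_loss = bce(discriminate(teacher_feat), 1.0) + bce(discriminate(student_feat), 0.0)

# Student ("generator") loss: fool the discriminator into labelling the
# student's features as real; in training this term would be added to
# the usual one-stage detection loss.
g_loss = bce(discriminate(student_feat), 1.0)
```

In an actual training loop the discriminator and the student detector would be updated alternately on `d_loss` and `g_loss`, pushing the student's feature maps toward the teacher's distribution without hand-designing a feature-matching cost function.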
