Logit Normalization for Long-tail Object Detection

03/31/2022
by   Liang Zhao, et al.

Real-world data exhibiting skewed distributions pose a serious challenge to existing object detectors. Moreover, the samplers in detectors lead to shifted training label distributions, while the tremendous proportion of background to foreground samples severely harms foreground classification. To mitigate these issues, in this paper we propose Logit Normalization (LogN), a simple technique that self-calibrates the classified logits of detectors in a manner similar to batch normalization. In general, LogN is training- and tuning-free (i.e., it requires no extra training or tuning process), model- and label-distribution-agnostic (i.e., it generalizes to different kinds of detectors and datasets), and plug-and-play (i.e., it can be applied directly without any bells and whistles). Extensive experiments on the LVIS dataset demonstrate the superior performance of LogN over state-of-the-art methods across various detectors and backbones. We also provide in-depth studies on different aspects of LogN. Further experiments on ImageNet-LT reveal its competitiveness and generalizability. LogN can serve as a strong baseline for long-tail object detection and is expected to inspire future research in this field. Code and trained models will be publicly available at https://github.com/MCG-NJU/LogN.
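The abstract does not give the exact formulation, but a minimal sketch of the general idea of calibrating logits "in a manner similar to batch normalization" might look as follows. This is a hypothetical illustration, not the paper's actual method: the function name `logit_normalize` and the choice of standardizing each sample's logits across classes are assumptions for demonstration only.

```python
import numpy as np

def logit_normalize(logits: np.ndarray, eps: float = 1e-7) -> np.ndarray:
    """Hypothetical sketch: standardize logits across the class dimension,
    analogous to batch normalization but applied at inference time,
    requiring no extra training."""
    mean = logits.mean(axis=-1, keepdims=True)
    std = logits.std(axis=-1, keepdims=True)
    return (logits - mean) / (std + eps)

# Example: raw class logits for one detection proposal
raw = np.array([[2.0, 4.0, 6.0]])
calibrated = logit_normalize(raw)
```

Because such a transform depends only on the logits themselves, it would be training- and tuning-free in the sense described above; the actual per-class statistics and calibration details used by LogN are defined in the paper.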
