A^2-FPN: Attention Aggregation based Feature Pyramid Network for Instance Segmentation

05/07/2021
by Miao Hu, et al.

Learning pyramidal feature representations is crucial for recognizing object instances at different scales. Feature Pyramid Network (FPN) is the classic architecture for building a feature pyramid with high-level semantics throughout. However, intrinsic defects in feature extraction and fusion prevent FPN from aggregating more discriminative features. In this work, we propose the Attention Aggregation based Feature Pyramid Network (A^2-FPN) to improve multi-scale feature learning through attention-guided feature aggregation. In feature extraction, it extracts discriminative features by collecting and distributing multi-level global context features, and mitigates the semantic information loss caused by drastically reduced channels. In feature fusion, it aggregates complementary information from adjacent features to generate location-wise reassembly kernels for content-aware sampling, and employs channel-wise reweighting to enhance semantic consistency before element-wise addition. A^2-FPN shows consistent gains on different instance segmentation frameworks. By replacing FPN with A^2-FPN in Mask R-CNN, our model boosts the performance by 2.1% and 1.6% mask AP when using ResNet-50 and ResNet-101 as backbones, respectively. Moreover, A^2-FPN achieves an improvement of 2.0% and 1.4% mask AP when integrated into stronger baselines such as Cascade Mask R-CNN and Hybrid Task Cascade.
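To make the fusion idea concrete, below is a minimal PyTorch sketch of channel-wise reweighting applied to an upsampled top-down feature before element-wise addition with the lateral feature. This is an illustrative assumption in the spirit of the abstract, not the authors' released implementation; the module name (ReweightedFusion) and its parameters (channels, reduction) are hypothetical.

```python
# Hypothetical sketch (not the authors' code): channel-wise reweighting of an
# upsampled top-down feature before element-wise addition with the lateral one.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReweightedFusion(nn.Module):
    """Fuse a lateral feature with an upsampled top-down feature.

    The top-down feature is rescaled per channel by weights predicted from
    globally pooled context, then added element-wise to the lateral feature.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 8)
        # Small bottleneck that predicts one gating weight per channel.
        self.gate = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, lateral: torch.Tensor, top_down: torch.Tensor) -> torch.Tensor:
        # Bilinearly upsample the coarser top-down feature to the lateral size.
        top_down = F.interpolate(
            top_down, size=lateral.shape[-2:], mode="bilinear", align_corners=False
        )
        # Global average pooling summarizes channel statistics of both inputs.
        context = F.adaptive_avg_pool2d(lateral + top_down, 1)
        weights = self.gate(context)  # shape (N, C, 1, 1), values in (0, 1)
        # Channel-wise reweighting before element-wise addition.
        return lateral + weights * top_down


if __name__ == "__main__":
    fuse = ReweightedFusion(channels=256)
    lateral = torch.randn(2, 256, 64, 64)   # finer lateral feature
    top_down = torch.randn(2, 256, 32, 32)  # coarser top-down feature
    print(fuse(lateral, top_down).shape)    # torch.Size([2, 256, 64, 64])
```

The content-aware sampling described in the abstract would replace the plain bilinear interpolation above with location-wise reassembly kernels predicted from the adjacent features; the fixed upsampling here is only a placeholder to keep the sketch short.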
