BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures
Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency. Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs. This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architectures in a large design space. Specifically, we design a two-level (macro & micro) search space tailored for BNNs and apply differentiable neural architecture search (NAS) to explore this search space efficiently. The macro-level search space covers depth and width decisions, which are required to better balance model performance and capacity. We also modify the micro-level search space to strengthen the information flow in BNNs. A notable challenge of BNN architecture search is that binary operations exacerbate the "collapse" problem of differentiable NAS, so we incorporate various search and derivation strategies to stabilize the search process. On CIFAR-10, the BARS-discovered architecture achieves 1.5% higher accuracy with 2/3 the binary operations and 1/10 the floating-point operations. On ImageNet, with similar resource consumption, the BARS-discovered architecture achieves a 3% accuracy gain over hand-crafted architectures, while removing the full-precision downsample layer.
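As background for the differentiable search the abstract describes, the sketch below shows the standard DARTS-style primitive it builds on: each cell edge holds a softmax-weighted mixture of candidate operations (here including a binarized convolution), and the mixture weights are learnable architecture parameters. This is a minimal, hypothetical illustration; the class names (`BinaryConv`, `MixedOp`), the candidate operation set, and the straight-through binarization are assumptions for exposition, not the paper's actual implementation.

```python
# Hypothetical sketch of a differentiable-NAS edge with binary candidate ops.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinaryConv(nn.Module):
    """3x3 conv whose weights are binarized with a straight-through estimator."""

    def __init__(self, channels):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(channels, channels, 3, 3) * 0.01)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        # sign() in the forward pass, identity gradient in the backward pass (STE)
        w_bin = self.weight + (torch.sign(self.weight) - self.weight).detach()
        return self.bn(F.conv2d(x, w_bin, padding=1))


class MixedOp(nn.Module):
    """Softmax-weighted mixture over candidate operations on one cell edge."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),                  # skip connection
            BinaryConv(channels),           # binary 3x3 conv
            nn.AvgPool2d(3, 1, padding=1),  # average pooling
        ])
        # architecture parameters: one logit per candidate operation
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


if __name__ == "__main__":
    x = torch.randn(2, 16, 32, 32)
    edge = MixedOp(16)
    print(edge(x).shape)  # torch.Size([2, 16, 32, 32])
```

In this formulation, architecture parameters (`alpha`) and network weights are optimized jointly by gradient descent; BARS additionally searches macro-level depth and width and applies stabilization strategies to counter the collapse problem noted above, which this sketch does not cover.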