Neural Architecture Ranker

01/30/2022
by Bicheng Guo, et al.

Architecture ranking has recently been advocated as a way to design an efficient and effective performance predictor for Neural Architecture Search (NAS). Previous contrastive methods solve the ranking problem by comparing pairs of architectures and predicting their relative performance, which may suffer from generalization issues due to the local nature of pair-wise comparison. Inspired by the quality stratification phenomenon in the search space, we propose a predictor, namely Neural Architecture Ranker (NAR), that takes a new, global perspective by exploiting the quality distribution of the whole search space. The NAR learns the shared characteristics of each quality tier (i.e., level) and distinguishes among individual architectures by first matching architectures with the representation of tiers, and then classifying and scoring them. It can capture the features of different quality tiers and thus generalize its ranking ability to the entire search space. Moreover, the distributions of the quality tiers are also beneficial for guiding the sampling procedure, which requires no trained search algorithm and thus simplifies the NAS pipeline. The proposed NAR achieves better performance than state-of-the-art methods on two widely accepted datasets. On NAS-Bench-101, it finds architectures whose performance is within the top 0.01‰ of the search space and stably focuses on the top architectures. On NAS-Bench-201, it identifies the optimal architectures on CIFAR-10, CIFAR-100, and ImageNet-16-120. We expand and release these two datasets with detailed cell-level computational information to boost the study of NAS.
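To make the tier-matching idea concrete, here is a minimal PyTorch sketch of a ranker in this spirit: an architecture encoding is soft-matched against a set of learnable per-tier embeddings, and the tier-aware feature is then both classified into a tier and scored. All names (TierRanker, tier_embeddings, the head layers) and the attention-style matching are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class TierRanker(nn.Module):
    """Toy tier-based ranker (illustrative, not the paper's NAR):
    match an architecture encoding against learned tier embeddings,
    then classify its quality tier and emit a fine-grained score."""

    def __init__(self, feat_dim: int = 64, num_tiers: int = 5):
        super().__init__()
        # One learnable representation per quality tier (assumed design).
        self.tier_embeddings = nn.Parameter(torch.randn(num_tiers, feat_dim))
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim),
        )
        self.tier_head = nn.Linear(feat_dim, num_tiers)  # which tier?
        self.score_head = nn.Linear(feat_dim, 1)         # ranking score

    def forward(self, arch_feat: torch.Tensor):
        h = self.encoder(arch_feat)                                # (B, D)
        # Soft-match the encoding to each tier representation.
        attn = torch.softmax(h @ self.tier_embeddings.T, dim=-1)  # (B, T)
        matched = h + attn @ self.tier_embeddings                  # tier-aware feature
        tier_logits = self.tier_head(matched)                      # tier classification
        score = self.score_head(matched).squeeze(-1)               # fine-grained score
        return tier_logits, score

# Usage: rank a batch of (pre-featurized) candidate architectures.
ranker = TierRanker()
feats = torch.randn(8, 64)  # stand-in architecture encodings
tier_logits, scores = ranker(feats)
print("predicted tiers:", tier_logits.argmax(-1).tolist())
print("best candidate index:", scores.argmax().item())
```

In a full pipeline, the tier classification loss would teach the model the quality stratification of the search space, while the score provides the total ordering used to pick candidates; the per-tier statistics could then bias sampling toward high-quality regions, as the abstract describes.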
