Semi-Supervised Neural Architecture Search

02/24/2020
by Renqian Luo, et al.

Neural architecture search (NAS) relies on a good controller to generate better architectures or predict the accuracy of given architectures. However, training the controller requires both abundant and high-quality pairs of architectures and their accuracy, while it is costly to evaluate an architecture and obtain its accuracy. In this paper, we propose SemiNAS, a semi-supervised NAS approach that leverages numerous unlabeled architectures (without evaluation and thus at nearly no cost) to improve the controller. Specifically, SemiNAS 1) trains an initial controller with a small set of architecture-accuracy data pairs; 2) uses the trained controller to predict the accuracy of a large number of architectures (without evaluation); and 3) adds the generated data pairs to the original data to further improve the controller. SemiNAS has two advantages: 1) it reduces the computational cost under the same accuracy guarantee, and 2) it achieves higher accuracy under the same computational cost. On the NASBench-101 benchmark dataset, it discovers a top 0.01% architecture with less computational cost compared with regularized evolution and gradient-based methods. On ImageNet, it achieves a 24.2% top-1 error rate (under the mobile setting) using 4 GPU-days for search. We further apply it to the LJSpeech text-to-speech task, where it achieves a 97% intelligibility rate in the low-resource setting and a 15% error rate in the robustness setting, both improvements over the baseline. Our code is available at https://github.com/renqianluo/SemiNAS.
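To make the three-step procedure concrete, below is a minimal Python sketch of the semi-supervised self-training loop the abstract describes. The Controller class, the architecture encoding, and the evaluate_architecture helper are hypothetical stand-ins chosen for illustration; they are not the API of the released repository, and the trivial mean-value predictor only marks where the paper's neural accuracy predictor would go.

import random

def sample_architecture():
    # Hypothetical encoding: a fixed-length list of operation ids.
    return [random.randint(0, 4) for _ in range(7)]

def evaluate_architecture(arch):
    # Stand-in for the costly step: in practice, train the architecture
    # and return its measured validation accuracy.
    return random.uniform(0.80, 0.95)

class Controller:
    """Hypothetical stand-in for the paper's controller with an accuracy predictor."""
    def __init__(self):
        self.mean_acc = 0.0

    def train(self, archs, accuracies):
        # Stand-in: a real controller fits a neural predictor on
        # (architecture, accuracy) pairs here.
        self.mean_acc = sum(accuracies) / len(accuracies)

    def predict(self, archs):
        # Stand-in: return a predicted accuracy per architecture
        # without any actual evaluation.
        return [self.mean_acc for _ in archs]

def seminas_search(num_labeled=100, num_unlabeled=10000):
    # 1) Train an initial controller on a small labeled set.
    labeled_archs = [sample_architecture() for _ in range(num_labeled)]
    labeled_accs = [evaluate_architecture(a) for a in labeled_archs]
    controller = Controller()
    controller.train(labeled_archs, labeled_accs)

    # 2) Predict accuracy for many unlabeled architectures (no evaluation).
    unlabeled_archs = [sample_architecture() for _ in range(num_unlabeled)]
    pseudo_accs = controller.predict(unlabeled_archs)

    # 3) Add the pseudo-labeled pairs to the original data and retrain.
    controller.train(labeled_archs + unlabeled_archs,
                     labeled_accs + list(pseudo_accs))
    return controller

The key design point the sketch highlights is that only step 1 incurs real evaluation cost; steps 2 and 3 enlarge the training set with pseudo-labeled architectures at nearly no cost.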
