Fine-Tuning DARTS for Image Classification

06/16/2020
by Muhammad Suhaib Tanveer, et al.

Neural Architecture Search (NAS) has gained attention due to its superior classification performance. Differentiable Architecture Search (DARTS) is a computationally light NAS method. To limit computational resources, DARTS makes numerous approximations, which result in inferior performance. We propose to fine-tune DARTS using fixed operations, as they are independent of these approximations. Our method offers a good trade-off between the number of parameters and classification accuracy. Our approach improves the top-1 accuracy on the Fashion-MNIST, CompCars, and MIO-TCD datasets by 0.56%, […], and 0.39%, respectively, and performs better than DARTS, improving the accuracy by 0.28%, […], and 4.5% on […], CompCars, and MIO-TCD datasets, respectively.
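To make the core idea concrete, the following is a minimal sketch (not the authors' code) of what fine-tuning with fixed operations can look like in PyTorch: a DARTS-style search is assumed to have already produced a discrete cell, so the cell below contains only fixed operations and no architecture parameters, and fine-tuning optimizes network weights alone. All names (`FIXED_OPS`, `FixedCell`, `FixedNetwork`, the example genotype) are hypothetical illustrations, not the paper's implementation.

```python
# Illustrative sketch only. Assumes a completed DARTS-style search has yielded
# a list of fixed operations; we then fine-tune the weights of a network built
# from those fixed operations, which no longer depend on the continuous
# relaxation used during the search phase.
import torch
import torch.nn as nn

# Hypothetical pool of fixed operations selected by the search.
FIXED_OPS = {
    "sep_conv_3x3": lambda c: nn.Sequential(
        nn.Conv2d(c, c, 3, padding=1, groups=c, bias=False),  # depthwise
        nn.Conv2d(c, c, 1, bias=False),                       # pointwise
        nn.BatchNorm2d(c),
        nn.ReLU(inplace=True),
    ),
    "max_pool_3x3": lambda c: nn.MaxPool2d(3, stride=1, padding=1),
    "skip_connect": lambda c: nn.Identity(),
}

class FixedCell(nn.Module):
    """A cell whose operations are fixed; it has no architecture parameters."""
    def __init__(self, channels, op_names):
        super().__init__()
        self.ops = nn.ModuleList(FIXED_OPS[name](channels) for name in op_names)

    def forward(self, x):
        # Combine the outputs of the fixed operations.
        return sum(op(x) for op in self.ops)

class FixedNetwork(nn.Module):
    """Small classifier built from a fixed cell (example genotype)."""
    def __init__(self, channels=16, num_classes=10,
                 genotype=("sep_conv_3x3", "skip_connect", "max_pool_3x3")):
        super().__init__()
        self.stem = nn.Conv2d(1, channels, 3, padding=1, bias=False)
        self.cell = FixedCell(channels, genotype)
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):
        x = self.cell(self.stem(x))
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)

def fine_tune(model, loader, epochs=1, lr=0.01):
    # Only network weights are trained; the operations themselves stay fixed.
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model
```

In use, `fine_tune(FixedNetwork(), train_loader)` would be called with a standard `torch.utils.data.DataLoader` over the target dataset (for example, single-channel Fashion-MNIST images, which is why the stem assumes one input channel).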
