GAdaBoost: Accelerating Adaboost Feature Selection with Genetic Algorithms

09/20/2016
by   Mai Tolba, et al.

The boosted cascade of simple features, by Viola and Jones, is one of the most famous object detection frameworks. However, it suffers from a lengthy training process, due to the vast feature space and the exhaustive-search nature of Adaboost. In this paper we propose GAdaboost: a Genetic Algorithm to accelerate the training procedure through natural feature selection. Specifically, we propose to limit the Adaboost search to a subset of the huge feature space, while evolving this subset with a Genetic Algorithm. Experiments demonstrate that our proposed GAdaboost is up to 3.7 times faster than Adaboost. We also demonstrate that the price of this speedup is a mere decrease (3%, 4%) in detection accuracy when tested on the FDDB benchmark face detection set and on Caltech Web Faces, respectively.
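To make the idea concrete, here is a minimal, self-contained sketch of the approach the abstract describes: a genetic algorithm evolves small feature subsets, and each subset's fitness is the training accuracy of a tiny AdaBoost ensemble of decision stumps restricted to that subset. This is an illustrative toy on synthetic data, not the paper's implementation: the data, population sizes, selection/crossover/mutation operators, and all function names below are assumptions chosen for brevity, whereas the paper operates on Haar-like features in the Viola-Jones cascade.

```python
import math
import random

random.seed(0)

# Toy synthetic data: N samples, D candidate features; only a few matter.
N, D = 200, 40
X = [[random.gauss(0.0, 1.0) for _ in range(D)] for _ in range(N)]
y = [1 if x[3] + x[17] - x[29] > 0 else -1 for x in X]

def stump_error(feat, thresh, polarity, weights):
    """Weighted error of a decision stump on one feature."""
    return sum(w for x, label, w in zip(X, y, weights)
               if (polarity if x[feat] > thresh else -polarity) != label)

def best_stump(features, weights):
    """Exhaustive stump search, but only over the given feature subset
    (the key restriction GAdaboost imposes on Adaboost's search)."""
    best = None
    for f in features:
        for t in (-1.0, -0.5, 0.0, 0.5, 1.0):
            for p in (1, -1):
                e = stump_error(f, t, p, weights)
                if best is None or e < best[3]:
                    best = (f, t, p, e)
    return best

def adaboost_accuracy(features, rounds=5):
    """Train a small AdaBoost ensemble restricted to `features`;
    its training accuracy serves as the GA fitness."""
    weights = [1.0 / N] * N
    ensemble = []
    for _ in range(rounds):
        f, t, p, e = best_stump(features, weights)
        e = min(max(e, 1e-9), 1.0 - 1e-9)
        alpha = 0.5 * math.log((1.0 - e) / e)
        ensemble.append((f, t, p, alpha))
        # Re-weight samples: misclassified ones gain weight.
        for i, x in enumerate(X):
            pred = p if x[f] > t else -p
            weights[i] *= math.exp(-alpha * pred * y[i])
        total = sum(weights)
        weights = [w / total for w in weights]
    correct = sum(
        1 for x, label in zip(X, y)
        if (1 if sum(a * (p if x[f] > t else -p)
                     for f, t, p, a in ensemble) > 0 else -1) == label)
    return correct / N

def evolve_subset(pop_size=6, subset_size=6, generations=4, mut_rate=0.3):
    """GA over feature subsets: truncation selection, union crossover,
    and random-replacement mutation (all simplified choices)."""
    pop = [random.sample(range(D), subset_size) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=adaboost_accuracy, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            pool = list(set(a) | set(b))           # crossover: mix parent genes
            child = random.sample(pool, subset_size)
            if random.random() < mut_rate:         # mutation: swap in a new feature
                new_gene = random.choice([g for g in range(D) if g not in child])
                child[random.randrange(subset_size)] = new_gene
            children.append(child)
        pop = parents + children
    return max(pop, key=adaboost_accuracy)

best = evolve_subset()
print("best subset:", sorted(best), "accuracy:", adaboost_accuracy(best))
```

The speedup in the paper comes from the same mechanism visible here: each boosting round searches only `subset_size` features instead of all `D`, and the GA steers which subset is searched.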
