AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods

07/04/2013
by Robert M. Freund, et al.

Boosting methods are highly popular and effective supervised learning methods that combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FS_ε), by establishing their precise connections to the Mirror Descent algorithm, a first-order method in convex optimization. As a consequence of these connections, we obtain novel computational guarantees for these boosting methods. In particular, we characterize convergence bounds for AdaBoost, with respect to both the margin and the log-exponential loss function, for any step-size sequence. Furthermore, this paper presents, for the first time, precise computational complexity results for FS_ε.
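To make the objects of study concrete, below is a minimal NumPy sketch of AdaBoost with decision stumps. The helper names (fit_stump, stump_predict) and the optional step_size argument are illustrative assumptions, not the paper's notation; the classical step size alpha_t = 0.5*log((1-err)/err) is just one choice, while the paper's analysis covers arbitrary step-size sequences. The multiplicative weight update over the training examples is the step that admits a mirror-descent interpretation.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted decision stump: best (feature, threshold, sign) under weights w."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err

def stump_predict(stump, X):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)

def adaboost(X, y, n_rounds=50, step_size=None):
    """AdaBoost with an optional user-supplied step-size sequence.

    If step_size is None, the classical choice is used; otherwise
    step_size(t, err) may return any positive step, mirroring the
    setting in which the paper's bounds hold for any sequence.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # distribution over training examples
    ensemble = []
    for t in range(n_rounds):
        stump, err = fit_stump(X, y, w)
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err) if step_size is None else step_size(t, err)
        pred = stump_predict(stump, X)
        # Multiplicative (exponentiated) weight update and renormalization.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y, n_rounds=30)
print("train accuracy:", np.mean(predict(model, X) == y))
```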
