Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

05/11/2017
by Yue Yu, et al.

We consider the stochastic composition optimization problem proposed in [wang2017stochastic], which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm for this problem, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives. When the objective is convex and Lipschitz smooth, it achieves a convergence rate of O(log S / S), which improves upon the O(S^{-4/9}) rate in [wang2016accelerating]. Moreover, com-SVR-ADMM attains a rate of O(1/√S) when the objective is convex but not Lipschitz smooth. We also conduct experiments showing that it outperforms existing algorithms.
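For context, stochastic composition optimization minimizes the composition of two expected-value functions. A minimal sketch of the general problem form, in assumed notation (not necessarily the exact constrained ADMM formulation used in this paper):

\min_x \; F(x) = \mathbb{E}_v\big[\, f_v\big( \mathbb{E}_w[\, g_w(x) \,] \big) \,\big].

Because the inner expectation sits inside f_v, a naively sampled gradient of the composition is generally biased, which is the main difficulty that variance-reduced methods in this setting aim to control.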
