Accelerated Stochastic Mirror Descent Algorithms For Composite Non-strongly Convex Optimization
We consider the problem of minimizing the sum of the average of a large number of smooth convex component functions and a general, possibly non-differentiable, convex function. Although many methods have been proposed to solve this problem under the assumption that the sum is strongly convex, few support the non-strongly convex case. A common trick for tackling non-strongly convex problems is to add a small quadratic regularizer; however, this may degrade the quality of solutions or weaken the performance of the algorithms. Avoiding this trick, we propose a new accelerated stochastic mirror descent method that solves the problem without any strong convexity assumption. Our method extends the deterministic accelerated proximal gradient methods of Paul Tseng and applies even when proximal points are computed only inexactly. We prove that, with a suitable choice of the errors in computing the proximal points, our direct algorithms achieve the optimal convergence rate O(1/k^2). We also propose a scheme for solving the problem when the component functions are non-smooth, and finally apply the new algorithms to a class of composite convex-concave optimization problems.
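The abstract gives no pseudocode, so the following is a minimal, hypothetical sketch of the kind of method it describes: a Tseng-style accelerated proximal gradient loop with mini-batch stochastic gradients and a Euclidean mirror map, run on a toy l1-regularized least-squares instance. All names here (prox_h, grad_batch, LAM) are illustrative assumptions; the paper's actual step sizes, mirror map, and inexact-proximal-point error control differ.

```python
import numpy as np

LAM = 0.01  # assumed weight of the l1 term standing in for a general h

def prox_h(v, step):
    """Prox of h(x) = LAM * ||x||_1 (soft-thresholding); stands in for a
    general, possibly non-differentiable convex h. An exact prox is assumed
    here, whereas the paper also allows inexact proximal points."""
    return np.sign(v) * np.maximum(np.abs(v) - step * LAM, 0.0)

def grad_batch(x, A, b, idx):
    """Unbiased stochastic gradient of f(x) = (1/n) sum_i 0.5*(a_i^T x - b_i)^2,
    a smooth convex finite sum used purely as an example."""
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def accelerated_stochastic_prox_grad(A, b, L, iters=500, batch=32, seed=0):
    """Tseng-style accelerated proximal gradient with mini-batch gradients."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = z = np.zeros(d)
    for k in range(1, iters + 1):
        theta = 2.0 / (k + 1)            # Nesterov/Tseng extrapolation weight
        y = (1 - theta) * x + theta * z  # interpolation point
        idx = rng.choice(n, size=batch, replace=False)
        g = grad_batch(y, A, b, idx)     # stochastic gradient at y
        # mirror (here: Euclidean prox) step on the auxiliary sequence z
        z = prox_h(z - g / (theta * L), 1.0 / (theta * L))
        x = (1 - theta) * x + theta * z  # averaged iterate; O(1/k^2) in the
                                         # deterministic exact-prox setting
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 20))
    x_true = np.zeros(20); x_true[:3] = 1.0
    b = A @ x_true
    L = np.linalg.norm(A, 2) ** 2 / 200  # smoothness constant of f
    x_hat = accelerated_stochastic_prox_grad(A, b, L)
    obj = 0.5 * np.mean((A @ x_hat - b) ** 2) + LAM * np.abs(x_hat).sum()
    print(f"final objective: {obj:.4f}")
```

The three-sequence structure (y for gradient evaluation, z for the mirror/prox step, x as the averaged output) is the hallmark of Tseng's accelerated schemes; the stochastic and inexact-prox extensions in the paper replace the exact gradient and prox used in this sketch.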