Optimality of the Subgradient Algorithm in the Stochastic Setting

09/10/2019
by Daron Anderson, et al.

Recently, Jaouad Mourtada and Stéphane Gaïffas showed that the anytime Hedge algorithm has pseudo-regret O(log(d)/Δ) when the cost vectors are generated by an i.i.d. sequence in the cube [0,1]^d. Here d is the dimension and Δ is the suboptimality gap. This is remarkable because the Hedge algorithm was designed for the antagonistic setting. We prove a similar result for the anytime subgradient algorithm on the simplex: given i.i.d. cost vectors in the unit ball, our pseudo-regret bound is O(1/Δ) and does not depend on the dimension of the problem.
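To make the setting concrete, the following is a minimal Python sketch of an anytime projected-subgradient update on the probability simplex fed by an i.i.d. cost stream. The step size eta_t = 1/sqrt(t), the particular cost distribution, and the helper names (project_to_simplex, anytime_subgradient_simplex) are illustrative assumptions, not the paper's exact construction or analysis.

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def anytime_subgradient_simplex(cost_stream, d, horizon):
    """Projected online (sub)gradient descent on the simplex.

    cost_stream: callable returning the cost vector for each round
    (i.i.d. draws in the unit ball, as in the stochastic setting).
    The step size eta_t = 1/sqrt(t) is an illustrative anytime choice,
    not necessarily the schedule analysed in the paper.
    """
    x = np.full(d, 1.0 / d)            # start at the uniform distribution
    total_cost = 0.0
    cumulative = np.zeros(d)
    for t in range(1, horizon + 1):
        c = cost_stream()              # observe the round-t cost vector
        total_cost += float(c @ x)
        cumulative += c
        eta = 1.0 / np.sqrt(t)         # anytime: no horizon knowledge needed
        x = project_to_simplex(x - eta * c)
    # Empirical stand-in for pseudo-regret: compare against the best
    # fixed coordinate (vertex of the simplex) in hindsight.
    return total_cost - cumulative.min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    means = rng.uniform(0.2, 0.8, size=d)   # hypothetical per-coordinate means
    # Scale by sqrt(d) so each cost vector lies in the Euclidean unit ball.
    draw = lambda: np.clip(means + 0.1 * rng.standard_normal(d), 0.0, 1.0) / np.sqrt(d)
    print("empirical regret:", anytime_subgradient_simplex(draw, d, 10_000))
```

In this sketch the regret against the empirical best coordinate typically plateaus once the leader is identified, which is the qualitative behaviour behind a Δ-dependent (rather than √T) pseudo-regret bound.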
