A Note on Zeroth-Order Optimization on the Simplex

08/02/2022
by Tijana Zrnic, et al.

We construct a zeroth-order gradient estimator for a smooth function defined on the probability simplex. The proposed estimator queries only points on the simplex. We prove that projected gradient descent and the exponential weights algorithm, when run with this estimator in place of exact gradients, converge at an 𝒪(T^(-1/4)) rate.
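To make the pattern in the abstract concrete, here is a minimal Python sketch: a two-point finite-difference gradient estimate whose query points lie on the simplex, fed to exponential weights in place of exact gradients. This is an illustrative construction, not the paper's estimator; the perturbation scheme (zero-sum random directions plus projection back onto the simplex), the dimension scaling, and the step-size choices are all assumptions made for the sketch.

import numpy as np

def project_simplex(v):
    """Euclidean projection onto {x : x >= 0, sum(x) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def zo_gradient(f, x, delta, rng):
    """Two-point estimate along a random zero-sum direction, so both query
    points stay on the sum(x)=1 hyperplane; projection handles negative
    coordinates near the boundary. The len(x) scaling is a heuristic
    borrowed from sphere-sampling estimators, not the paper's choice."""
    u = rng.standard_normal(len(x))
    u -= u.mean()                      # zero-sum direction
    u /= np.linalg.norm(u)
    xp = project_simplex(x + delta * u)
    xm = project_simplex(x - delta * u)
    return len(x) * (f(xp) - f(xm)) / (2.0 * delta) * u

def exponential_weights(f, d, T, eta=0.1, delta=1e-3, seed=0):
    """Exponential weights (multiplicative updates) driven by the
    zeroth-order estimate in place of exact gradients."""
    rng = np.random.default_rng(seed)
    x = np.full(d, 1.0 / d)            # start at the uniform distribution
    for _ in range(T):
        g = zo_gradient(f, x, delta, rng)
        w = x * np.exp(-eta * g)       # multiplicative update for minimization
        x = w / w.sum()
    return x

# Toy usage: minimize a smooth quadratic over the simplex.
if __name__ == "__main__":
    target = np.array([0.5, 0.3, 0.2])
    f = lambda x: np.sum((x - target) ** 2)
    x_hat = exponential_weights(f, d=3, T=5000)
    print(x_hat)  # should land close to target

Note that every query made by zo_gradient lies on the simplex, which is the property the abstract emphasizes; the 𝒪(T^(-1/4)) rate is the paper's guarantee for its own estimator and is not claimed for this sketch.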

