ReSQueing Parallel and Private Stochastic Convex Optimization

01/01/2023
by Yair Carmon, et al.

We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Stochastic Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density. Combining ReSQue with recent advances in ball oracle acceleration [CJJJLST20, ACJJS21], we develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings. For an SCO objective constrained to the unit ball in ℝ^d, we obtain the following results (up to polylogarithmic factors).

We give a parallel algorithm obtaining optimization error ϵ_opt with d^{1/3} ϵ_opt^{-2/3} gradient oracle query depth and d^{1/3} ϵ_opt^{-2/3} + ϵ_opt^{-2} gradient queries in total, assuming access to a bounded-variance stochastic gradient estimator. For ϵ_opt ∈ [d^{-1}, d^{-1/4}], our algorithm matches the state-of-the-art oracle depth of [BJLLS19] while maintaining the optimal total work of stochastic gradient descent.

We give an (ϵ_dp, δ)-differentially private algorithm which, given n samples of Lipschitz loss functions, obtains near-optimal optimization error and makes min(n, n^2 ϵ_dp^2 d^{-1}) + min(n^{4/3} ϵ_dp^{1/3}, (nd)^{2/3} ϵ_dp^{-1}) queries to the gradients of these functions. In the regime d ≤ n ϵ_dp^2, where privacy comes at no cost in the optimal loss up to constant factors, our algorithm uses n + (nd)^{2/3} ϵ_dp^{-1} queries and improves on the recent results of [KLL21, AFKT21]. In the moderately low-dimensional setting d ≤ √n ϵ_dp^{3/2}, our query complexity is near-linear.
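At a high level, ReSQue estimates the gradient of the Gaussian-smoothed objective f_ρ(x) = E_{ζ ~ N(0, ρ^2 I_d)}[f(x + ζ)] by reweighting stochastic gradient queries drawn around a shared reference point x̄, so that one batch of queries can serve many nearby points. The sketch below illustrates this reweighting idea with a plain importance-sampling estimator in Python/NumPy; the function name, the ℓ1 test objective, and all parameter values are illustrative assumptions rather than the paper's implementation.

    import numpy as np

    def resque_gradient_estimate(x, x_bar, stochastic_grad, rho, rng):
        """Sketch of a reweighted stochastic query for a smoothed gradient.

        Draws a query point y = x_bar + zeta with zeta ~ N(0, rho^2 I_d) and
        reweights the stochastic gradient at y by the ratio of Gaussian
        densities N(x, rho^2 I_d) / N(x_bar, rho^2 I_d) evaluated at y, so
        that (in expectation) the sample estimates the rho-smoothed gradient
        at x even though the query was centered at x_bar.
        """
        d = x.shape[0]
        zeta = rho * rng.standard_normal(d)   # zeta ~ N(0, rho^2 I_d)
        y = x_bar + zeta                      # shared query point
        # log of gamma_rho(y - x) / gamma_rho(y - x_bar), with y - x_bar = zeta
        log_weight = (np.dot(zeta, zeta) - np.dot(y - x, y - x)) / (2 * rho**2)
        return np.exp(log_weight) * stochastic_grad(y)

    # Example: average many ReSQue samples near a shared center x_bar, using a
    # (hypothetical) subgradient oracle for the Lipschitz objective f(y) = ||y||_1.
    rng = np.random.default_rng(0)
    x_bar = np.zeros(4)
    x = x_bar + 0.01                          # a nearby point in the same ball
    subgrad = lambda y: np.sign(y)            # a subgradient of the l1 norm
    estimate = np.mean(
        [resque_gradient_estimate(x, x_bar, subgrad, rho=0.1, rng=rng)
         for _ in range(10_000)],
        axis=0,
    )

The importance weight equals 1 when x = x̄ and remains well concentrated around 1 whenever ‖x − x̄‖ is small relative to ρ, which is what allows queries centered at x̄ to be shared across a small ball around it.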
