Private Convex Optimization in General Norms

07/18/2022
by Sivakanth Gopi, et al.

We propose a new framework for differentially private optimization of convex functions which are Lipschitz in an arbitrary norm ‖·‖. Our algorithms are based on a regularized exponential mechanism which samples from the density ∝ exp(−k(F + μr)), where F is the empirical loss and r is a regularizer which is strongly convex with respect to ‖·‖, generalizing a recent work of <cit.> to non-Euclidean settings. We show that this mechanism satisfies Gaussian differential privacy and solves both DP-ERM (empirical risk minimization) and DP-SCO (stochastic convex optimization) by using localization tools from convex geometry. Our framework is the first to apply to private convex optimization in general normed spaces, and it directly recovers the non-private SCO rates achieved by mirror descent as the privacy parameter ε → ∞. As applications, for Lipschitz optimization in ℓ_p norms for all p ∈ (1, 2), we obtain the first optimal privacy-utility tradeoffs; for p = 1, we improve the tradeoffs obtained by the recent works <cit.> by at least a logarithmic factor. Our ℓ_p norm and Schatten-p norm optimization frameworks are complemented with polynomial-time samplers whose query complexity we explicitly bound.
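
To make the mechanism concrete, below is a minimal sketch (not the paper's sampler) that targets the density ∝ exp(−k(F + μr)) with the ℓ_p regularizer r(θ) = ½‖θ‖_p², which is (p−1)-strongly convex with respect to ‖·‖_p for p ∈ (1, 2]. The toy loss F, the random-walk Metropolis sampler, and all parameter values are illustrative assumptions; the paper's polynomial-time samplers and parameter settings differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_density(theta, F, mu, p, k):
    # Unnormalized log-density of the mechanism: -k * (F(theta) + mu * r(theta)),
    # with r(theta) = 0.5 * ||theta||_p^2, strongly convex w.r.t. the l_p norm.
    r = 0.5 * np.linalg.norm(theta, ord=p) ** 2
    return -k * (F(theta) + mu * r)

def sample_mechanism(F, dim, mu=0.1, p=1.5, k=10.0, step=0.05, n_steps=5000):
    # Random-walk Metropolis chain targeting exp(-k(F + mu*r)); a stand-in
    # for the paper's samplers, whose query complexity is explicitly bounded.
    theta = np.zeros(dim)
    logp = log_density(theta, F, mu, p, k)
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(dim)
        logp_prop = log_density(proposal, F, mu, p, k)
        # Accept with probability min(1, exp(logp_prop - logp)).
        if np.log(rng.random()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
    return theta

# Usage: a toy Lipschitz empirical loss over synthetic data.
X = rng.standard_normal((20, 5))
F = lambda theta: np.mean(np.abs(X @ theta - 1.0))
print(sample_mechanism(F, dim=5))
```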
