Momentum Aggregation for Private Non-convex ERM

10/12/2022
by   Hoang Tran, et al.

We introduce new algorithms and convergence guarantees for privacy-preserving non-convex Empirical Risk Minimization (ERM) on smooth d-dimensional objectives. We develop an improved sensitivity analysis of stochastic gradient descent on smooth objectives that exploits the recurrence of examples across epochs. Combining this new approach with a recent analysis of momentum and private aggregation techniques, we obtain an (ϵ,δ)-differentially private algorithm that finds a point with gradient norm Õ(d^1/3/(ϵN)^2/3) in O(N^7/3 ϵ^4/3/d^2/3) gradient evaluations, improving on the previous best gradient-norm bound of Õ(d^1/4/√(ϵN)).
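To make the ingredients concrete, the sketch below shows the standard DP-SGD-with-momentum template the abstract builds on: per-example gradients are clipped to bound sensitivity, Gaussian noise is added for (ϵ,δ)-privacy, and a momentum buffer averages the noisy gradients. This is a hypothetical minimal illustration of the general technique, not the paper's actual algorithm; the function name `dp_sgd_momentum` and all hyperparameters are assumptions for the example.

```python
import numpy as np

def dp_sgd_momentum(grad_fn, data, w0, *, epochs=5, lr=0.1, beta=0.9,
                    clip=1.0, noise_mult=1.0, rng=None):
    """Illustrative DP-SGD with momentum (not the paper's exact method):
    clip each per-example gradient, add Gaussian noise scaled to the clip
    norm, then fold the noisy gradient into a momentum buffer."""
    rng = np.random.default_rng(rng)
    w = np.asarray(w0, dtype=float).copy()
    m = np.zeros_like(w)
    for _ in range(epochs):
        for x in rng.permutation(data):
            g = grad_fn(w, x)
            # Clip to norm <= `clip` so one example's influence is bounded.
            g = g / max(1.0, np.linalg.norm(g) / clip)
            # Gaussian noise calibrated to the clipped sensitivity.
            g = g + rng.normal(0.0, noise_mult * clip, size=g.shape)
            # Momentum averaging damps the injected noise over steps.
            m = beta * m + (1.0 - beta) * g
            w = w - lr * m
    return w
```

On a toy one-dimensional quadratic (per-example loss (w - x)^2), the iterate drifts toward the data mean despite the injected noise, which is the qualitative behavior the momentum averaging is meant to preserve.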
