Local differential privacy: Elbow effect in optimal density estimation and adaptation over Besov ellipsoids

03/05/2019
by Cristina Butucea, et al.

We address the problem of non-parametric density estimation under the additional constraint that only privatised data may be published and used for inference. To this end, we adopt a recent generalisation of classical minimax theory to the framework of local α-differential privacy and provide a lower bound on the rate of convergence over Besov spaces B^s_{pq} under mean integrated L^r-risk. This lower bound is slower than the corresponding rate in the standard setup without privacy and reveals a twofold elbow effect. To fulfil the privacy requirement, we suggest adding suitably scaled Laplace noise to empirical wavelet coefficients. Upper bounds within (at most) a logarithmic factor are derived under the assumption that α stays bounded as n increases: a linear but non-adaptive wavelet estimator is shown to attain the lower bound whenever p ≥ r, but it provides a slower rate of convergence otherwise. An adaptive non-linear wavelet estimator with appropriately chosen smoothing parameters and thresholding is shown to attain the lower bound within a logarithmic factor in all cases.
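The privatisation step described in the abstract lends itself to a brief illustration. The Python sketch below is a minimal, hypothetical rendering of the general idea under local α-differential privacy, using Haar wavelets for concreteness: each data holder evaluates the wavelet basis functions at their own observation, adds independent Laplace noise with scale proportional to 1/α, and releases only the noisy values; the analyst averages these releases to obtain privatised empirical wavelet coefficients. The noise calibration `2 * 2^{j/2} / alpha` used here is a generic choice for bounded Haar functions at a single resolution level, not the paper's exact construction; the precise scaling, choice of levels, and thresholding are given in the paper.

```python
import numpy as np

def haar_mother(x):
    """Haar mother wavelet psi, supported on [0, 1)."""
    return np.where((x >= 0) & (x < 0.5), 1.0,
                    np.where((x >= 0.5) & (x < 1.0), -1.0, 0.0))

def haar_psi(j, k, x):
    """Dilated and translated Haar wavelet psi_{jk}(x) = 2^{j/2} psi(2^j x - k)."""
    return 2 ** (j / 2) * haar_mother(2 ** j * x - k)

def privatise_level(x_sample, j, alpha, rng):
    """Each observation releases noisy evaluations of all psi_{jk} at level j.

    For Haar wavelets at a fixed level j, at most one translate is non-zero at
    any point x and |psi_{jk}(x)| <= 2^{j/2}, so the L1 sensitivity of the
    released vector is at most 2 * 2^{j/2}; Laplace noise with that scale
    divided by alpha gives alpha-local differential privacy for this level
    (a generic calibration, assumed here for illustration).
    """
    ks = np.arange(2 ** j)
    evals = haar_psi(j, ks[None, :], x_sample[:, None])   # shape (n, 2^j)
    scale = 2 * 2 ** (j / 2) / alpha
    noise = rng.laplace(loc=0.0, scale=scale, size=evals.shape)
    return evals + noise                                   # only this is released

def estimate_coefficients(released):
    """Analyst averages the privatised releases to estimate beta_{jk}."""
    return released.mean(axis=0)

rng = np.random.default_rng(0)
x = rng.beta(2, 5, size=5000)        # toy sample supported on [0, 1]
z = privatise_level(x, j=3, alpha=1.0, rng=rng)
beta_hat = estimate_coefficients(z)
print(beta_hat)
```

Averaging the n noisy releases keeps the coefficient estimates unbiased while the added Laplace noise inflates their variance by a factor of order 1/α², which is the source of the deteriorated rates and the elbow effect discussed above.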
