Differential Private Discrete Noise Adding Mechanism: Conditions, Properties and Optimization

03/19/2022
by Shuying Qin, et al.

Differential privacy is a standard framework for quantifying the privacy loss in data anonymization. To preserve differential privacy, random noise-adding mechanisms are widely adopted, and the trade-off between the privacy level and data utility is of central concern. The privacy and utility properties of continuous noise-adding mechanisms have been well studied, but the existing results are insufficient for discrete random mechanisms acting on discretely distributed data, e.g., traffic data and health records. This paper focuses on discrete random noise-adding mechanisms. We study the basic differential privacy conditions and properties of general discrete random mechanisms, as well as the trade-off between data privacy and data utility. Specifically, we derive a necessary and sufficient condition for discrete epsilon-differential privacy and a sufficient condition for discrete (epsilon, delta)-differential privacy, together with numerical estimation of the differential privacy parameters. These conditions can be applied to analyze the differential privacy properties of discrete noise-adding mechanisms with various kinds of noise. Then, under the differential privacy guarantees, we propose an optimal discrete epsilon-differentially private noise-adding mechanism in a utility-maximization framework, where utility is characterized by the similarity of the statistical properties between the mechanism's input and output. For this setup, we find that the class of discrete noise probability distributions in the optimal mechanism is staircase-shaped.
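The paper's general conditions and its optimal staircase-shaped mechanism are not reproduced in this abstract, but the setting can be illustrated with a minimal sketch: a discrete noise-adding mechanism over integer-valued queries using two-sided geometric noise (a standard discrete analogue of the Laplace mechanism, not the paper's construction), together with a numerical check of the epsilon-DP likelihood-ratio condition on a truncated support. The function names, the truncated-support check, and the choice of noise distribution are illustrative assumptions.

```python
import numpy as np

def two_sided_geometric_pmf(k, epsilon, sensitivity=1):
    """Pr[Z = k] = (1 - alpha) / (1 + alpha) * alpha^{|k|}, with alpha = exp(-epsilon / sensitivity)."""
    alpha = np.exp(-epsilon / sensitivity)
    return (1 - alpha) / (1 + alpha) * alpha ** np.abs(k)

def sample_two_sided_geometric(epsilon, sensitivity=1, rng=None):
    """Sample the noise as the difference of two i.i.d. (shifted) geometric variables."""
    rng = rng or np.random.default_rng()
    alpha = np.exp(-epsilon / sensitivity)
    return (rng.geometric(1 - alpha) - 1) - (rng.geometric(1 - alpha) - 1)

def release(true_count, epsilon):
    """Discrete noise-adding mechanism: output = integer query answer + discrete noise."""
    return int(true_count + sample_two_sided_geometric(epsilon))

def worst_case_ratio(epsilon, sensitivity=1, support=200):
    """Estimate sup_k p(k) / p(k + d) over |d| <= sensitivity on a truncated support.
    An epsilon-DP noise distribution must keep this ratio at most exp(epsilon)."""
    ks = np.arange(-support, support)
    worst = 0.0
    for d in range(1, sensitivity + 1):
        p = two_sided_geometric_pmf(ks, epsilon, sensitivity)
        q = two_sided_geometric_pmf(ks + d, epsilon, sensitivity)
        worst = max(worst, float(np.max(p / q)))
    return worst

if __name__ == "__main__":
    eps = 0.5
    print("noisy count:", release(42, eps))
    print("worst-case ratio:", worst_case_ratio(eps), "<= exp(eps) =", np.exp(eps))
```

For the two-sided geometric distribution the worst-case ratio equals exp(epsilon) exactly, so the check is tight; for other discrete noise distributions the same ratio test gives a numerical estimate of the achievable privacy parameter, in the spirit of the conditions studied in the paper.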
