Differentially Private Obfuscation Mechanisms for Hiding Probability Distributions

12/03/2018
by Yusuke Kawamoto, et al.

We propose a formal model for the privacy of user attributes in terms of differential privacy. In particular, we introduce a notion, called distribution privacy, as differential privacy for probability distributions. Roughly, a local obfuscation mechanism with distribution privacy perturbs each single input so that an attacker observing an output of the mechanism cannot gain significant information about the probability distribution of the inputs. We then show that existing local obfuscation mechanisms have only a limited effect on distribution privacy. For instance, we prove that, to provide distribution privacy w.r.t. the approximate max-divergence (resp. f-divergence), the amount of noise added by the Laplace mechanism must be proportional to the ∞-Wasserstein (resp. Earth mover's) distance between the two distributions we want to make indistinguishable. To provide a stronger level of distribution privacy, we introduce an obfuscation mechanism, called the tupling mechanism, that perturbs a given input and adds random dummy data. We then apply the tupling mechanism to the protection of user attributes in location-based services and demonstrate by experiments that it outperforms popular local (extended) differentially private mechanisms in terms of both distribution privacy and utility. Finally, we discuss the relationships among utility, privacy, and the cost of adding dummy data.
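For intuition, below is a minimal Python sketch of the two ideas mentioned in the abstract: the standard Laplace mechanism, and a tupling-style mechanism that releases a perturbed input mixed with random dummy data. This is an illustration under assumptions, not the paper's exact construction; the parameter names (num_dummies, dummy_sampler) and the choice of dummy-sampling distribution are made up here for the example.

```python
import numpy as np

def laplace_mechanism(x, sensitivity, epsilon):
    """Standard Laplace mechanism: output x plus Laplace noise of scale sensitivity/epsilon.
    Per the abstract, achieving distribution privacy this way requires noise proportional
    to the infinity-Wasserstein distance between the distributions to be hidden."""
    return x + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

def tupling_mechanism(x, sensitivity, epsilon, num_dummies, dummy_sampler):
    """Illustrative tupling-style mechanism (assumed sketch, not the paper's definition):
    perturb the true input, draw num_dummies dummy points from dummy_sampler, and
    release the shuffled tuple so the observer cannot tell which entry is real."""
    perturbed = laplace_mechanism(x, sensitivity, epsilon)
    outputs = [perturbed] + [dummy_sampler() for _ in range(num_dummies)]
    np.random.shuffle(outputs)  # hide the position of the real (perturbed) value
    return tuple(outputs)

# Example: obfuscate a 1-D location in [0, 100] with 3 uniformly drawn dummies.
released = tupling_mechanism(x=42.0, sensitivity=1.0, epsilon=0.5,
                             num_dummies=3,
                             dummy_sampler=lambda: np.random.uniform(0, 100))
print(released)
```

The intuition for the dummy data is that even if the Laplace noise alone is too weak to hide which input distribution the true value came from, an observer cannot single out the genuine entry among the dummies, which is what the paper's distribution-privacy analysis makes precise.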
