The Right Complexity Measure in Locally Private Estimation: It is not the Fisher Information

06/14/2018
by John C. Duchi et al.

We identify fundamental tradeoffs between statistical utility and privacy under local models of privacy, in which data are kept private even from the statistician, providing instance-specific bounds for private estimation and learning problems by developing the local minimax risk. In contrast to approaches based on worst-case (minimax) error, which are conservative, the local minimax risk allows us to evaluate the difficulty of individual problem instances and delineate the possibilities for adaptation in private estimation and inference. Our main results show that the local modulus of continuity of the estimand with respect to the variation distance---as opposed to the Hellinger distance central to classical statistics---characterizes rates of convergence under locally private estimation for many notions of privacy, including differential privacy and its relaxations. As consequences of these results, we identify an alternative to the Fisher information for private estimation, giving a more nuanced understanding of the challenges of adaptivity and optimality, and we provide new minimax bounds for high-dimensional estimation showing that even interactive locally private procedures suffer poor performance under weak notions of privacy.
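As a concrete illustration of the local privacy model the abstract describes---each individual's datum is privatized before it ever reaches the statistician---here is a minimal sketch (not taken from the paper) of estimating a Bernoulli mean under epsilon-local differential privacy via Warner's randomized response. All function names and parameter choices below are illustrative assumptions, not the authors' construction.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps),
    else flip it. This mechanism satisfies epsilon-local DP."""
    p_keep = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_keep else 1 - bit

def debiased_mean(reports: list[int], epsilon: float) -> float:
    """Invert the known flipping probability to get an unbiased
    estimate of the true mean from the privatized reports."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    raw_mean = sum(reports) / len(reports)
    return (raw_mean - (1.0 - p)) / (2.0 * p - 1.0)

random.seed(0)
theta, n, eps = 0.3, 200_000, 1.0  # illustrative values
data = [1 if random.random() < theta else 0 for _ in range(n)]
reports = [randomized_response(b, eps) for b in data]
est = debiased_mean(reports, eps)
# est should be close to theta, but with variance inflated by
# roughly ((e^eps + 1)/(e^eps - 1))^2 relative to the non-private
# estimator -- the utility cost of local privacy.
```

Note how the debiasing step inflates the estimator's variance: this privacy-induced penalty, rather than the classical Fisher information, is what the paper's local minimax framework quantifies precisely.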
