Secure and Differentially Private Bayesian Learning on Distributed Data
Data integration and sharing greatly enhance the potential for novel and meaningful discoveries. However, integrating data from multiple sources is a non-trivial task, as it can put the sensitive information of study participants at risk. To address this privacy concern, we present a distributed Bayesian learning approach via Preconditioned Stochastic Gradient Langevin Dynamics with RMSprop, which combines differential privacy and homomorphic encryption to protect private information. We applied the proposed secure and privacy-preserving distributed Bayesian learning approach to logistic regression and survival analysis on distributed data, and demonstrated its feasibility in terms of prediction accuracy and time complexity relative to the centralized approach.
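To make the sampling component concrete, below is a minimal sketch of Preconditioned Stochastic Gradient Langevin Dynamics with an RMSprop-style preconditioner applied to Bayesian logistic regression, with optional gradient clipping and Gaussian noise in the spirit of differentially private gradient perturbation. All function and parameter names (`psgld_logistic`, `alpha`, `lam`, `dp_noise_scale`, etc.) and the specific noise calibration are illustrative assumptions, not the paper's protocol; in particular, the homomorphic-encryption aggregation step across parties is not shown here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def psgld_logistic(X, y, n_iter=5000, batch_size=32, step=1e-3,
                   alpha=0.99, lam=1e-5, prior_var=1.0, clip=1.0,
                   dp_noise_scale=0.0, rng=None):
    """Sketch of a single party's local pSGLD sampler; DP noise is optional.

    Hyperparameter names and defaults are assumptions for illustration only.
    """
    rng = np.random.default_rng(rng)
    N, d = X.shape
    theta = np.zeros(d)
    v = np.zeros(d)  # RMSprop moving average of squared gradients
    samples = []
    for _ in range(n_iter):
        idx = rng.choice(N, batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        # Minibatch estimate of the log-posterior gradient (Gaussian prior)
        grad_lik = Xb.T @ (yb - sigmoid(Xb @ theta)) * (N / batch_size)
        grad_prior = -theta / prior_var
        g = grad_lik + grad_prior
        # Optional clipping + Gaussian noise (gradient-perturbation-style DP)
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)
        if dp_noise_scale > 0:
            g = g + rng.normal(0.0, dp_noise_scale * clip, size=d)
        # RMSprop-style diagonal preconditioner
        v = alpha * v + (1 - alpha) * g ** 2
        G = 1.0 / (lam + np.sqrt(v))
        # Langevin update: preconditioned gradient step plus injected noise
        theta = theta + 0.5 * step * G * g \
            + rng.normal(0.0, np.sqrt(step * G), size=d)
        samples.append(theta.copy())
    return np.array(samples)
```

The returned array of parameter draws approximates the local posterior; in a distributed setting one would combine such updates or samples across sites, which is where the encrypted aggregation described in the abstract would come in.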