Splintering with distributions: A stochastic decoy scheme for private computation

07/06/2020
by Praneeth Vepakomma, et al.

Performing computations while maintaining privacy is an important problem in today's distributed machine learning solutions. Consider the following two setups between a client and a server. In setup i), the client has a public data vector 𝐱, the server has a large private database of data vectors ℬ, and the client wants to find the inner products ⟨𝐱,𝐲_𝐤⟩, ∀𝐲_𝐤∈ℬ. The client does not want the server to learn 𝐱, while the server does not want the client to learn the records in its database. In contrast, in setup ii) the client would like to perform an operation solely on its own data, such as computing the inverse of its data matrix 𝐌, but would like to use the superior computing ability of the server without leaking 𝐌 to it. We present a stochastic scheme for splitting the client data into privatized shares that are transmitted to the server in such settings. The server performs the requested operations on these shares instead of on the raw client data, and the intermediate results are sent back to the client, which assembles them to obtain the final result.
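The share-and-reassemble idea in setup i) can be illustrated with a minimal sketch. Note this uses a plain additive split (the vector is written as a sum of random shares), not the paper's stochastic decoy scheme; because the inner product is linear, the client can recover ⟨𝐱,𝐲_𝐤⟩ by summing the server's per-share results. All function names below are hypothetical.

```python
import random

def split_into_shares(x, num_shares=3):
    # Additively split x into random shares that sum back to x.
    # Simplified stand-in for the paper's splintering scheme, which
    # instead draws privatized shares from chosen distributions.
    shares = [[random.gauss(0.0, 1.0) for _ in x] for _ in range(num_shares - 1)]
    last = [xi - sum(col) for xi, col in zip(x, zip(*shares))]
    shares.append(last)
    return shares

def server_inner_products(share, database):
    # Server computes <share, y_k> for every y_k without seeing x itself.
    return [sum(s * y for s, y in zip(share, yk)) for yk in database]

def client_assemble(partials):
    # Client sums the per-share partial results to recover <x, y_k>.
    return [sum(vals) for vals in zip(*partials)]

x = [1.0, 2.0, 3.0]
database = [[4.0, 5.0, 6.0], [1.0, 0.0, -1.0]]  # server's private y_k vectors
shares = split_into_shares(x)
partials = [server_inner_products(s, database) for s in shares]
result = client_assemble(partials)
# result ≈ [<x, y_1>, <x, y_2>] = [32.0, -2.0]
```

A single random share reveals nothing about 𝐱 on its own; the actual scheme additionally interleaves decoys so that the server cannot reconstruct 𝐱 even from the full set of transmitted shares.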
