Privacy-Preserving Coded Mobile Edge Computing for Low-Latency Distributed Inference

10/07/2021
by Reent Schlegel, et al.

We consider a mobile edge computing scenario in which multiple devices want to perform a linear inference Wx on some local data x given a network-side matrix W. The computation is offloaded to several edge servers at the network edge. We propose a coding scheme that provides information-theoretic privacy against z colluding (honest-but-curious) edge servers while minimizing the overall latency, comprising the upload, computation, download, and decoding latencies, in the presence of straggling servers. The proposed scheme exploits Shamir's secret sharing to provide both data privacy and straggler mitigation, combined with replication to provide spatial diversity for the download. We also propose two variants of the scheme that further reduce the latency. For a considered scenario with 9 edge servers, the proposed scheme reduces the latency by 8% compared to the nonprivate scheme recently introduced by Zhang and Simeone, while providing privacy against a single honest-but-curious edge server.
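To make the coding idea concrete, the following Python sketch (not the paper's implementation) illustrates how Shamir's secret sharing over a prime field lets each edge server compute W times a share of x, so that Wx can be recovered by Lagrange interpolation from any z+1 non-straggling servers while any z colluding servers learn nothing about x. The field size, evaluation points, and toy dimensions are illustrative assumptions, and the replication used in the paper for download diversity is omitted here.

```python
# Minimal sketch, assuming a prime field of size p and toy dimensions.
# Shamir's secret sharing of the data x: any z colluding servers learn
# nothing, while W @ x is recovered by Lagrange interpolation from the
# results of any z + 1 non-straggling servers.

import numpy as np

p = 65537                      # prime field size (illustrative choice)
rng = np.random.default_rng(0)

def share(x, z, alphas):
    """Split vector x into len(alphas) Shamir shares, private against z colluders."""
    coeffs = [x] + [rng.integers(0, p, size=x.shape) for _ in range(z)]
    # Share for server e is the degree-z polynomial evaluated at alpha_e.
    return [sum(c * pow(int(a), k, p) for k, c in enumerate(coeffs)) % p
            for a in alphas]

def lagrange_at_zero(alphas, values):
    """Interpolate the degree-z polynomial at t = 0 from (alpha, value) pairs."""
    out = np.zeros_like(values[0])
    for i, (a_i, v_i) in enumerate(zip(alphas, values)):
        num, den = 1, 1
        for j, a_j in enumerate(alphas):
            if j != i:
                num = num * (-a_j) % p
                den = den * (a_i - a_j) % p
        out = (out + v_i * num * pow(den, -1, p)) % p
    return out

# Toy instance: 9 edge servers, privacy against z = 1 colluding server.
n_servers, z = 9, 1
W = rng.integers(0, p, size=(3, 4))
x = rng.integers(0, p, size=4)

alphas = list(range(1, n_servers + 1))          # distinct nonzero eval points
shares = share(x, z, alphas)

# Each edge server applies the linear map W to its share.
results = [W @ s % p for s in shares]

# Any z + 1 = 2 non-straggling servers suffice to decode W @ x.
fastest = [0, 4]
decoded = lagrange_at_zero([alphas[i] for i in fastest],
                           [results[i] for i in fastest])
assert np.array_equal(decoded, W @ x % p)
```

Because each server only ever sees a single evaluation of a degree-z polynomial with uniformly random coefficients, any z servers jointly observe values that are statistically independent of x, which is the information-theoretic privacy guarantee the abstract refers to; straggler mitigation follows from needing only z+1 of the n responses.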
