Privacy is What We Care About: Experimental Investigation of Federated Learning on Edge Devices

11/11/2019
by Anirban Das, et al.

Federated Learning enables training of a general model through edge devices without sending raw data to the cloud. Hence, this approach is attractive for digital health applications, where data is sourced through edge devices and users care about privacy. Here, we report on the feasibility of training deep neural networks on Raspberry Pi 4s as edge devices. A CNN, an LSTM, and an MLP were successfully trained on the MNIST dataset. Further, federated learning is demonstrated experimentally on IID and non-IID samples in a parametric study to benchmark model convergence. The weight updates from the workers are shared with the cloud to train the general model through federated learning. With the CNN and non-IID samples, a test accuracy of up to 85% was achieved within a training time of 2 minutes, while exchanging less than 10 MB of data per device. In addition, we discuss federated learning from a use-case standpoint, elaborating on privacy risks and labeling requirements for the application of emotion detection from sound. Based on the experimental findings, we discuss possible research directions to improve model and system performance. Finally, we provide best practices for practitioners considering the implementation of federated learning.
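The abstract describes workers sharing weight updates with the cloud, which combines them into a general model. The paper's actual aggregation code is not reproduced here; the sketch below illustrates the standard FedAvg-style weighted average that such a setup commonly uses. The worker count, layer shapes, and per-device sample counts are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of FedAvg-style aggregation on the cloud side.
# Assumption: each edge worker uploads its local model weights (a list of
# per-layer arrays) plus the number of local samples it trained on.
import numpy as np


def fedavg(worker_weights, worker_sample_counts):
    """Average per-layer weights across workers, weighted by local sample count."""
    total = sum(worker_sample_counts)
    n_layers = len(worker_weights[0])
    global_weights = []
    for layer in range(n_layers):
        layer_avg = sum(
            weights[layer] * (count / total)
            for weights, count in zip(worker_weights, worker_sample_counts)
        )
        global_weights.append(layer_avg)
    return global_weights


# Example: three simulated edge workers, each with two weight tensors.
rng = np.random.default_rng(0)
workers = [[rng.normal(size=(4, 4)), rng.normal(size=(4,))] for _ in range(3)]
counts = [1000, 500, 2000]  # hypothetical local sample counts per device

global_model = fedavg(workers, counts)
print([w.shape for w in global_model])  # [(4, 4), (4,)]
```

In a deployment like the one described, this aggregation step would run once per communication round: workers train locally (e.g., on their MNIST shards), upload updated weights, and receive the averaged global model back before the next round.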
