Federated Learning with Cooperating Devices: A Consensus Approach for Massive IoT Networks

12/27/2019
by Stefano Savazzi, et al.

Federated learning (FL) is emerging as a new paradigm for training machine learning models in distributed systems. Rather than sharing and disclosing the training dataset with a server, the model parameters (e.g., neural network weights and biases) are optimized collectively by large populations of interconnected devices acting as local learners. FL can be applied to power-constrained IoT devices with slow and sporadic connections. In addition, it does not require data to be exported to third parties, preserving privacy. Despite these benefits, a main limitation of existing approaches is the centralized optimization, which relies on a server for aggregation and fusion of local parameters; this creates a single point of failure and scaling issues as the network size grows. The paper proposes a fully distributed (or server-less) learning approach: the proposed FL algorithms leverage the cooperation of devices that perform data operations inside the network by iterating local computations and mutual interactions via consensus-based methods. The approach lays the groundwork for integrating FL within 5G and beyond networks characterized by decentralized connectivity and computing, with intelligence distributed over the end-devices. The proposed methodology is verified on experimental datasets collected inside an industrial IoT environment.
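To make the server-less idea concrete, below is a minimal sketch of decentralized learning where devices alternate a local gradient step with a consensus-averaging round over their neighbors. All names (Device, consensus_round, the ring topology, step sizes) are illustrative assumptions, not the paper's actual algorithm or datasets.

```python
# Sketch of server-less federated learning via consensus averaging (illustrative only).
import numpy as np

class Device:
    """A local learner holding private data and a local model parameter vector."""
    def __init__(self, data, labels, dim, lr=0.1):
        self.data, self.labels = data, labels
        self.w = np.zeros(dim)   # local model parameters
        self.lr = lr

    def local_step(self):
        """One gradient step on the local (private) least-squares loss."""
        grad = self.data.T @ (self.data @ self.w - self.labels) / len(self.labels)
        self.w -= self.lr * grad

def consensus_round(devices, neighbors, eps=0.5):
    """Each device mixes its parameters with its neighbors' copies; no central server."""
    current = [d.w.copy() for d in devices]
    for i, d in enumerate(devices):
        deg = max(len(neighbors[i]), 1)
        for j in neighbors[i]:
            d.w += (eps / deg) * (current[j] - current[i])

# Toy run: 4 devices on a ring topology, alternating local updates and consensus.
rng = np.random.default_rng(0)
true_w = rng.normal(size=5)
devices = []
for _ in range(4):
    X = rng.normal(size=(20, 5))
    y = X @ true_w + 0.01 * rng.normal(size=20)
    devices.append(Device(X, y, dim=5))
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}

for _ in range(200):
    for d in devices:
        d.local_step()
    consensus_round(devices, ring)

print("max disagreement across devices:",
      max(np.linalg.norm(d.w - devices[0].w) for d in devices))
```

The consensus step drives the local models toward a common estimate using only neighbor-to-neighbor exchanges, which is the key property that removes the need for a central aggregation server.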
