Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data

07/05/2019
by Jin-Hyun Ahn, et al.

Cooperative training methods for distributed machine learning typically assume noiseless and ideal communication channels. This work studies some of the opportunities and challenges arising from the presence of wireless communication links. We specifically consider wireless implementations of Federated Learning (FL) and Federated Distillation (FD), as well as of a novel Hybrid Federated Distillation (HFD) scheme. Both digital implementations based on separate source-channel coding and over-the-air computing implementations based on joint source-channel coding are proposed and evaluated over Gaussian multiple-access channels.
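To make the distinction between the two implementations concrete, the following minimal sketch illustrates the general idea, not the paper's exact protocol: in Federated Distillation, devices share per-class average logits rather than model weights, and in the analog over-the-air (joint source-channel coding) variant, these logits are transmitted simultaneously so that the Gaussian multiple-access channel itself performs the summation. All names and parameters here (number of devices `K`, `num_classes`, `noise_std`, `tx_power`, the placeholder logit data, and the simple power-scaling rule) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (assumptions, not the paper's exact scheme):
# K devices aggregate per-class average logits (Federated Distillation)
# over a Gaussian multiple-access channel via analog over-the-air computing.

rng = np.random.default_rng(0)

K = 10            # number of edge devices (assumed)
num_classes = 10  # size of the classification task (assumed)
noise_std = 0.1   # std of additive Gaussian channel noise (assumed)
tx_power = 1.0    # per-device transmit power budget (assumed)

# Each device's locally computed per-class average logits (placeholder data).
local_logits = rng.normal(size=(K, num_classes))

# --- Analog over-the-air (joint source-channel coding) aggregation ---
# Devices scale their logits to meet the power constraint and transmit at
# the same time; the Gaussian MAC adds the waveforms plus noise.
scale = np.sqrt(tx_power) / np.max(np.abs(local_logits))
superposed = (scale * local_logits).sum(axis=0)
received = superposed + rng.normal(scale=noise_std, size=num_classes)

# The receiver inverts the scaling and divides by K to estimate the global
# average logits used as the distillation target.
global_logits_ota = received / (scale * K)

# --- Digital (separate source-channel coding) baseline ---
# With ideal error-free links, the server simply averages the logits.
global_logits_digital = local_logits.mean(axis=0)

print("over-the-air estimate:", np.round(global_logits_ota, 3))
print("noiseless average:    ", np.round(global_logits_digital, 3))
```

The gap between `global_logits_ota` and `global_logits_digital` reflects the channel noise that the over-the-air scheme must tolerate in exchange for bandwidth-efficient, simultaneous transmission.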
