Over-The-Air Computation for Distributed Machine Learning
Motivated by applications of distributed machine learning (ML) in massive wireless sensor networks, this paper addresses the problem of computing approximate values of functions over the wireless channel and illustrates its results with applications to distributed training and ML-based prediction. The “over-the-air” computation of a function of data captured at different wireless devices has great potential for reducing communication cost, which is needed, for example, for training ML models. It is of particular interest in massive wireless scenarios because, as shown in this paper, its communication cost for training scales more favorably with the number of devices than that of traditional schemes that reconstruct all the data. We consider noisy fast-fading channels, which pose major challenges to “over-the-air” computation; as a result, function values must be approximated from superimposed noisy signals transmitted by different devices. The fading and noise processes are not limited to Gaussian distributions but may belong to the more general class of sub-gaussian distributions. Moreover, our results do not require the fading and the noise to be independent, thus allowing for correlations over time and between devices.
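To make the idea concrete, here is a minimal simulation sketch of over-the-air computation of a mean, not the paper's actual scheme: every device transmits its (pre-scaled) reading simultaneously, the channel superimposes the signals with per-device fading and additive receiver noise, and the receiver recovers an approximate function value from the superposition. All names, parameters, and the Gaussian fading/noise model below are illustrative assumptions; the paper only requires sub-gaussian distributions and also allows correlations.

```python
# Hypothetical simulation of over-the-air mean computation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

M = 1000                              # number of devices (assumed)
T = 64                                # repeated channel uses per evaluation (assumed)
x = rng.uniform(0.0, 1.0, size=M)     # local sensor readings
target = x.mean()                     # function to be computed over the air

estimates = np.empty(T)
for t in range(T):
    # Fading with mean 1 and additive noise; Gaussian here purely for
    # illustration -- both are special cases of sub-gaussian processes.
    h = 1.0 + 0.1 * rng.standard_normal(M)
    noise = 0.05 * rng.standard_normal()
    # One channel use: all M signals superimpose into a single received value.
    y = np.sum(h * x) + noise
    estimates[t] = y / M              # unbiased estimate of the mean

approx = estimates.mean()
print(f"target={target:.4f}  estimate={approx:.4f}  error={abs(approx - target):.4f}")
```

Note how each function evaluation costs T channel uses regardless of M, whereas a scheme that reconstructs all the data needs on the order of M transmissions; this loosely illustrates the favorable scaling with the number of devices claimed above.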