Collaborative Machine Learning at the Wireless Edge with Blind Transmitters

07/08/2019
by Mohammad Mohammadi Amiri, et al.

We study wireless collaborative machine learning (ML), where mobile edge devices, each with its own dataset, carry out distributed stochastic gradient descent (DSGD) over-the-air with the help of a wireless access point acting as the parameter server (PS). At each iteration of the DSGD algorithm, the wireless devices compute gradient estimates from their local datasets and send them to the PS over a wireless fading multiple access channel (MAC). Motivated by the additive nature of the wireless MAC, we propose an analog DSGD scheme in which the devices transmit scaled versions of their gradient estimates in an uncoded fashion. We assume that channel state information (CSI) is available only at the PS. Instead, we allow the PS to employ multiple antennas to alleviate the destructive fading effect, which cannot be cancelled by the transmitters due to their lack of CSI. Theoretical analysis indicates that, with the proposed DSGD scheme, increasing the number of PS antennas mitigates the fading effect; in the limit, the effects of fading and noise disappear, and the PS receives aligned signals with which to update the model parameters. The theoretical results are then corroborated by experiments.
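The core effect described in the abstract can be illustrated with a small simulation. The sketch below is not taken from the paper; it assumes i.i.d. unit-variance Rayleigh fading, uncoded (unscaled) transmission, and a simple conjugate-channel combining rule at the PS, chosen so that, by the law of large numbers over the antennas, cross-device interference and noise average out and the combined signal approaches the sum of the gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 10, 32  # number of devices, gradient dimension

def ota_sum_error(M, snr_db=10.0):
    """Relative error of over-the-air gradient aggregation with M PS antennas.

    Devices transmit their gradient estimates uncoded (no CSI at the
    transmitters); the PS, which knows the channel gains, combines the
    antenna observations with conjugate channel weights.
    """
    # Local gradient estimates (stand-ins for real SGD gradients).
    g = rng.standard_normal((K, d))
    # i.i.d. Rayleigh fading: h[i, m] from device i to PS antenna m.
    h = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)
    # Additive receiver noise at the given SNR.
    noise_std = 10 ** (-snr_db / 20)
    z = noise_std * (rng.standard_normal((M, d)) + 1j * rng.standard_normal((M, d))) / np.sqrt(2)
    # Each antenna observes the superposition of all uncoded transmissions.
    y = h.T @ g.astype(complex) + z          # shape (M, d)
    # PS combines with the conjugate sum of channel gains; for i.i.d.
    # unit-variance fading, cross terms vanish as M grows, leaving sum_i g_i.
    w = h.sum(axis=0).conj()                 # shape (M,)
    est = (w @ y).real / M                   # shape (d,)
    target = g.sum(axis=0)
    return np.linalg.norm(est - target) / np.linalg.norm(target)
```

Running `ota_sum_error` for a growing number of antennas shows the fading and noise effects shrinking: with a handful of antennas the aggregation error is on the order of the signal itself, while with thousands of antennas the recovered sum closely aligns with the true sum of the gradients, matching the limiting behavior the abstract describes.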


