Is Shapley Value fair? Improving Client Selection for Mavericks in Federated Learning

06/20/2021
by Jiyue Huang, et al.

Shapley Value is commonly adopted to measure and incentivize client participation in federated learning. In this paper, we show, both theoretically and through simulations, that Shapley Value underestimates the contribution of a common type of client: the Maverick. Mavericks are clients that differ in both data distribution and data quantity and can be the sole owners of certain types of data. Selecting the right clients at the right moment is important for federated learning to reduce convergence time and improve accuracy. We propose FedEMD, an adaptive client selection strategy based on the Wasserstein distance between the local and global data distributions. Because FedEMD adapts the selection probability so that Mavericks are preferentially selected when the model benefits from improvement on rare classes, it consistently ensures fast convergence in the presence of different types of Mavericks. Compared to existing strategies, including Shapley Value-based ones, FedEMD improves the convergence of neural network classifiers by at least 26.9% for FedAvg aggregation relative to the state of the art.
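The abstract does not give FedEMD's exact selection rule, so the sketch below is only a hypothetical illustration of the general idea: compute the Wasserstein (earth mover's) distance between each client's label distribution and the global label distribution, then shift selection probability over rounds so that distant clients (such as Mavericks holding rare classes) become more likely to be picked later in training. The function name, the decay schedule, and the score formula are assumptions, not the paper's method; only numpy and scipy.stats.wasserstein_distance are standard library calls.

    import numpy as np
    from scipy.stats import wasserstein_distance

    def client_selection_probs(client_label_counts, round_idx, decay=0.95):
        """Illustrative EMD-based selection probabilities (not the paper's exact rule).

        client_label_counts: list/array of per-client sample counts per class.
        Early rounds favour clients close to the global distribution; later
        rounds shift weight toward clients (e.g. Mavericks) with rare classes.
        """
        counts = np.asarray(client_label_counts, dtype=float)
        num_classes = counts.shape[1]
        classes = np.arange(num_classes)

        # Global label distribution aggregated over all clients.
        global_dist = counts.sum(axis=0)
        global_dist /= global_dist.sum()

        # Wasserstein (earth mover's) distance between each local and the
        # global label distribution, using class indices as the support.
        emd = np.array([
            wasserstein_distance(classes, classes,
                                 u_weights=c / c.sum(), v_weights=global_dist)
            for c in counts
        ])

        # Assumed schedule: the weight on "similar to global" decays per round,
        # so high-EMD clients (rare-class owners) become more likely later on.
        w = decay ** round_idx
        scores = w * (1.0 / (emd + 1e-8)) + (1.0 - w) * emd
        return scores / scores.sum()

    # Example: four regular clients plus one Maverick owning most of class 9.
    rng = np.random.default_rng(0)
    regular = rng.integers(50, 100, size=(4, 10))
    maverick = np.concatenate([rng.integers(0, 5, size=9), [800]])
    probs = client_selection_probs(np.vstack([regular, maverick]), round_idx=20)
    print(probs)

The decay schedule here merely mimics the abstract's claim that Mavericks should be preferred once the model benefits from improvement on rare classes; the paper's actual weighting between local and global distances may differ.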
