Communication-Efficient and Drift-Robust Federated Learning via Elastic Net

10/06/2022
by Seonhyeong Kim, et al.

Federated learning (FL) is a distributed method for training a global model over a set of local clients while keeping data localized. It reduces privacy and security risks but faces important challenges, including expensive communication costs and client drift. To address these issues, we propose FedElasticNet, a communication-efficient and drift-robust FL framework leveraging the elastic net. It repurposes the two elastic net regularizers (i.e., ℓ_1 and ℓ_2 penalties on the local model updates): (1) the ℓ_1-norm regularizer sparsifies the local updates to reduce communication costs, and (2) the ℓ_2-norm regularizer resolves the client drift problem by limiting the impact of local updates that drift due to data heterogeneity. FedElasticNet is a general framework for FL; hence, it can be integrated into prior FL techniques, e.g., FedAvg, FedProx, SCAFFOLD, and FedDyn, without additional costs. We show that our framework effectively resolves the communication cost and client drift problems simultaneously.
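As a rough illustration of the idea described above, the sketch below computes a local training loss with elastic net penalties on the local update (the difference between the local and global model parameters). The helper name elastic_net_local_loss and the coefficients lam1 and lam2 are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of an elastic-net-regularized local objective for FL,
# assuming hypothetical penalty coefficients lam1 and lam2.
import torch
import torch.nn as nn


def elastic_net_local_loss(model, global_params, data_loss, lam1, lam2):
    """Local loss with elastic net penalties on the update (w - w_global).

    - The l1 term encourages sparse local updates (cheaper communication).
    - The l2 term keeps the local model close to the global model,
      limiting client drift under heterogeneous data.
    """
    l1 = sum((p - g).abs().sum() for p, g in zip(model.parameters(), global_params))
    l2 = sum((p - g).pow(2).sum() for p, g in zip(model.parameters(), global_params))
    return data_loss + lam1 * l1 + 0.5 * lam2 * l2


# Toy usage: one local gradient step on synthetic data.
model = nn.Linear(10, 1)
global_params = [p.detach().clone() for p in model.parameters()]
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = elastic_net_local_loss(
    model, global_params, nn.functional.mse_loss(model(x), y), lam1=1e-4, lam2=1e-2
)
loss.backward()
```

Setting lam1 = 0 recovers a FedProx-style proximal term, while lam2 = 0 leaves only the sparsity-inducing penalty; the framework uses both jointly.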
