Authors
Pengchao Han, Shiqiang Wang, Kin K Leung
Publication date
2020/11/29
Conference
2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS)
Pages
300-310
Publisher
IEEE
Description
Federated learning (FL) is an emerging technique for training machine learning models using geographically dispersed data collected by local entities. It includes local computation and synchronization steps. To reduce the communication overhead and improve the overall efficiency of FL, gradient sparsification (GS) can be applied, where instead of the full gradient, only a small subset of important elements of the gradient is communicated. Existing work on GS uses a fixed degree of gradient sparsity for i.i.d. data within a datacenter. In this paper, we consider an adaptive degree of sparsity and non-i.i.d. local datasets. We first present a fairness-aware GS method which ensures that different clients provide a similar amount of updates. Then, with the goal of minimizing the overall training time, we propose a novel online learning formulation and algorithm for automatically determining the near-optimal …
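To illustrate the gradient sparsification idea described above, the following is a minimal sketch of top-k magnitude-based sparsification, where only the k largest-magnitude gradient elements are communicated. This is a generic illustration of GS, not the paper's specific adaptive or fairness-aware scheme; the function names are hypothetical.

```python
import numpy as np

def sparsify_top_k(gradient: np.ndarray, k: int):
    """Keep only the k largest-magnitude elements of the gradient.

    Returns (indices, values): the compressed update that would be
    communicated; all other elements are implicitly zero.
    """
    flat = gradient.ravel()
    # Indices of the k largest-|g| entries (order among them is arbitrary).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def desparsify(indices, values, shape):
    """Reconstruct a dense gradient from the sparse update."""
    flat = np.zeros(int(np.prod(shape)))
    flat[indices] = values
    return flat.reshape(shape)

# Example: communicate only 2 of 6 gradient elements.
g = np.array([[0.1, -2.0, 0.3],
              [4.0, -0.2, 0.05]])
idx, vals = sparsify_top_k(g, k=2)
g_hat = desparsify(idx, vals, g.shape)
# Only the two largest-magnitude entries (4.0 and -2.0) survive;
# the rest of the reconstructed gradient is zero.
```

In practice, the degree of sparsity (the value of k) trades off communication cost against update quality; the paper's contribution is choosing this degree adaptively via online learning rather than fixing it in advance.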
Total citations
[Per-year citation counts, 2019–2024: chart data not recoverable]
Scholar articles
P Han, S Wang, KK Leung - 2020 IEEE 40th international conference on distributed …, 2020