Authors
Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, Kin K Leung, Christian Makaya, Ting He, Kevin Chan
Publication date
2018/4/16
Conference
IEEE INFOCOM 2018 - IEEE Conference on Computer Communications
Pages
63-71
Publisher
IEEE
Description
Emerging technologies and applications including Internet of Things (IoT), social networking, and crowd-sourcing generate large amounts of data at the network edge. Machine learning models are often built from the collected data, to enable the detection, classification, and prediction of future events. Due to bandwidth, storage, and privacy concerns, it is often impractical to send all the data to a centralized location. In this paper, we consider the problem of learning model parameters from data distributed across multiple edge nodes, without sending raw data to a centralized place. Our focus is on a generic class of machine learning models that are trained using gradient-descent based approaches. We analyze the convergence rate of distributed gradient descent from a theoretical point of view, based on which we propose a control algorithm that determines the best trade-off between local update and global …
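The abstract describes training over edge nodes by interleaving local gradient-descent updates with periodic global parameter aggregation. Below is a minimal sketch of that general pattern, not the paper's specific control algorithm: the function and parameter names (distributed_gd, tau, rounds, lr) and the toy squared-loss objective are illustrative assumptions.

```python
# Sketch of distributed gradient descent with periodic global aggregation:
# each edge node performs tau local steps, then models are averaged globally.
import numpy as np

def local_gradient(w, X, y):
    """Gradient of the squared loss (1/2n)||Xw - y||^2 at one edge node."""
    return X.T @ (X @ w - y) / len(y)

def distributed_gd(node_data, dim, tau=5, rounds=20, lr=0.1):
    """node_data: list of (X_i, y_i) per edge node; tau: local updates
    between global aggregations; rounds: number of aggregation rounds."""
    w_global = np.zeros(dim)
    sizes = np.array([len(y) for _, y in node_data], dtype=float)
    weights = sizes / sizes.sum()            # weight nodes by local data size
    for _ in range(rounds):
        local_models = []
        for X, y in node_data:
            w = w_global.copy()
            for _ in range(tau):             # tau local gradient-descent steps
                w -= lr * local_gradient(w, X, y)
            local_models.append(w)
        # global aggregation: weighted average of the local models
        w_global = sum(p * w for p, w in zip(weights, local_models))
    return w_global

# Toy usage: two edge nodes with synthetic data from the same linear model.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
nodes = []
for n in (100, 200):
    X = rng.normal(size=(n, 3))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    nodes.append((X, y))
print(distributed_gd(nodes, dim=3))          # approaches w_true as rounds grow
```

Larger tau reduces communication (fewer global aggregations per local step) at the cost of local models drifting apart, which is the trade-off the paper's control algorithm tunes under a resource budget.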
Total citations
2018: 8, 2019: 62, 2020: 109, 2021: 108, 2022: 107, 2023: 98, 2024: 49
Scholar articles
S Wang, T Tuor, T Salonidis, KK Leung, C Makaya… - IEEE INFOCOM 2018-IEEE conference on computer …, 2018