Authors
Junjie Shi, Jiang Bian, Jakob Richter, Kuan-Hsun Chen, Jörg Rahnenführer, Haoyi Xiong, Jian-Jia Chen
Publication date
2021/6
Journal
Machine Learning
Volume
110
Issue
6
Pages
1527-1547
Publisher
Springer US
Description
The predictive performance of a machine learning model depends strongly on its hyper-parameter setting, so hyper-parameter tuning is often indispensable. Normally, such tuning requires the machine learning model to be trained and evaluated on centralized data to obtain a performance estimate. In a distributed machine learning scenario, however, it is not always possible to collect all the data from all nodes, due to privacy concerns or storage limitations. Moreover, if data has to be transferred over low-bandwidth connections, the time available for tuning is reduced. Model-Based Optimization (MBO) is a state-of-the-art method for tuning hyper-parameters, but its application to distributed machine learning models and federated learning has received little research attention. This work proposes a framework for deploying MBO on resource-constrained distributed embedded systems. Each node trains an …
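The abstract's core technique, Model-Based Optimization, iteratively fits a cheap surrogate model to past evaluations of an expensive objective (e.g., a model's validation loss for a hyper-parameter setting) and uses it to pick the next point to evaluate. The sketch below is a minimal, dependency-free illustration of that loop, not the paper's algorithm: the quadratic `objective`, the nearest-neighbour surrogate, and the distance-based exploration bonus are all simplifying assumptions for demonstration.

```python
import random

# Hypothetical black-box objective: stands in for an expensive
# train-and-evaluate run at hyper-parameter value x.
def objective(x):
    return (x - 2.0) ** 2

def mbo(n_init=3, n_iter=10, bounds=(-5.0, 5.0), seed=0):
    rng = random.Random(seed)
    # Initial design: a few random evaluations of the true objective.
    xs = [rng.uniform(*bounds) for _ in range(n_init)]
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        candidates = [rng.uniform(*bounds) for _ in range(100)]

        # Crude surrogate: predict a candidate's value from its nearest
        # evaluated neighbour, minus a distance bonus that rewards
        # exploring regions far from any previous evaluation.
        def acquisition(c):
            dist, y = min((abs(c - x), y) for x, y in zip(xs, ys))
            return y - 0.5 * dist  # lower is better

        best = min(candidates, key=acquisition)
        # Only the chosen candidate pays the cost of a true evaluation.
        xs.append(best)
        ys.append(objective(best))
    i = min(range(len(ys)), key=ys.__getitem__)
    return xs[i], ys[i]
```

In the paper's distributed setting, the expensive `objective` call would run on a remote node, while the surrogate fitting and candidate selection stay cheap enough for resource-constrained hardware.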