Authors
Koji Matsuda, Yuya Sasaki, Chuan Xiao, Makoto Onizuka
Publication date
2022
Book
Proceedings of the 2022 SIAM International Conference on Data Mining (SDM)
Pages
459-467
Publisher
Society for Industrial and Applied Mathematics
Description
Federated learning is a distributed machine learning method in which a single server and multiple clients collaboratively build machine learning models without sharing the clients' datasets. Numerous methods have been proposed to cope with the data heterogeneity issue in federated learning. Existing solutions require a model architecture tuned by the central server, yet a major technical challenge is that the architecture is difficult to tune because the central server has no access to local data. In this paper, we propose Federated learning via Model exchange (FedMe), which personalizes models with automatic model architecture tuning during the learning process. The novelty of FedMe lies in its learning process: clients exchange their models for model architecture tuning and model training. First, to optimize the model architectures for local data, clients tune their own personalized models by comparing to …
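The sketch below is only an illustration of the model-exchange idea summarized in the abstract: clients exchange models, compare the received model against their own on local data to tune their personalized architecture, and then train locally. The random pairing rule, the loss-based comparison, the toy non-IID data, and all hyperparameters are assumptions for illustration, not the authors' actual FedMe algorithm.

```python
"""Minimal model-exchange sketch (assumptions, not the FedMe paper's algorithm)."""
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(in_dim, hidden, out_dim):
    # One-hidden-layer MLP; `hidden` stands in for the "architecture" being tuned.
    return {"W1": rng.normal(0, 0.1, (in_dim, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0, 0.1, (hidden, out_dim)), "b2": np.zeros(out_dim)}

def forward(model, X):
    h = np.maximum(0.0, X @ model["W1"] + model["b1"])  # ReLU hidden layer
    return h, h @ model["W2"] + model["b2"]

def loss(model, X, y):
    _, logits = forward(model, X)
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

def sgd_step(model, X, y, lr=0.1):
    # Manual backprop for softmax cross-entropy on the tiny MLP above.
    h, logits = forward(model, X)
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1.0
    p /= len(y)
    dW2, db2 = h.T @ p, p.sum(axis=0)
    dh = p @ model["W2"].T
    dh[h <= 0] = 0.0
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    for k, g in zip(("W1", "b1", "W2", "b2"), (dW1, db1, dW2, db2)):
        model[k] -= lr * g

def client_data(bias_class, n=200, d=10, classes=3):
    # Toy non-IID data: each client over-represents one class (assumed setup).
    y = np.where(rng.random(n) < 0.7, bias_class, rng.integers(0, classes, n))
    return rng.normal(0, 1, (n, d)) + y[:, None] * 0.5, y

clients = []
for c in range(4):
    X, y = client_data(bias_class=c % 3)
    hidden = rng.choice([4, 16, 64])  # each client starts from a different architecture guess
    clients.append({"X": X, "y": y, "model": make_mlp(10, hidden, 3)})

for rnd in range(5):
    # Assumed pairing rule: the server pairs clients at random and they exchange models.
    order = rng.permutation(len(clients))
    for i, j in zip(order[::2], order[1::2]):
        ci, cj = clients[i], clients[j]
        # Each client compares the received model with its own on its local data
        # and keeps the better one -- a simple stand-in for architecture tuning.
        if loss(cj["model"], ci["X"], ci["y"]) < loss(ci["model"], ci["X"], ci["y"]):
            ci["model"] = {k: v.copy() for k, v in cj["model"].items()}
        if loss(ci["model"], cj["X"], cj["y"]) < loss(cj["model"], cj["X"], cj["y"]):
            cj["model"] = {k: v.copy() for k, v in ci["model"].items()}
    # Local personalized training on each client's own data.
    for c in clients:
        for _ in range(10):
            sgd_step(c["model"], c["X"], c["y"])

for idx, c in enumerate(clients):
    print(f"client {idx}: hidden={c['model']['W1'].shape[1]}, "
          f"local loss={loss(c['model'], c['X'], c['y']):.3f}")
```

Running the sketch shows each client converging toward an architecture that works well on its own (non-IID) data, which is the intuition behind exchanging and comparing models client-side rather than tuning a single architecture at the server.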