Paper

December 2020

Federated Learning of Neural Network Models with Heterogeneous Structures

Proceedings - 19th IEEE International Conference on Machine Learning and Applications, ICMLA 2020
  • Kundjanasith Thonglek
  • Keichi Takahashi
  • Kohei Ichikawa
  • Hajimu Iida
  • Chawanat Nakasan

Start page
735
End page
740
Language
Publication type
Research paper (international conference proceedings)
DOI
10.1109/ICMLA51294.2020.00120

Federated learning trains a model on a centralized server using datasets distributed over a large number of edge devices. Applying federated learning ensures data privacy because it does not transfer local data from edge devices to the server. Existing federated learning algorithms assume that all deployed models share the same structure. However, it is often infeasible to distribute the same model to every edge device because of hardware limitations such as computing performance and storage space. This paper proposes a novel federated learning algorithm to aggregate information from multiple heterogeneous models. The proposed method uses a weighted average ensemble to combine the outputs of the individual models. The ensemble weights are optimized using black-box optimization methods. We evaluated the proposed method using diverse models and datasets and found that it achieves performance comparable to conventional training on centralized datasets. Furthermore, we compared six different optimization methods for tuning the weights of the weighted average ensemble and found that the tree Parzen estimator achieves the highest accuracy among the alternatives.
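The core idea from the abstract — a weighted average ensemble over heterogeneous models, with the weights tuned by a black-box optimizer — can be sketched as below. This is a minimal illustration, not the paper's implementation: the per-model class probabilities are assumed to be precomputed, and simple random search over the weight simplex stands in for the six optimizers the paper compares (such as the tree Parzen estimator).

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_ensemble(probs_per_model, weights):
    """Combine per-model class-probability outputs with a weighted average.

    probs_per_model: shape (n_models, n_samples, n_classes)
    weights: shape (n_models,); normalized to sum to 1 before mixing
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    # Weighted sum over the model axis -> (n_samples, n_classes)
    return np.tensordot(w, probs_per_model, axes=1)

def tune_weights(probs_per_model, labels, n_trials=200):
    """Black-box search for ensemble weights that maximize validation accuracy.

    Random search is used here as a stand-in for optimizers like TPE.
    """
    n_models = probs_per_model.shape[0]
    best_w = np.ones(n_models) / n_models
    best_acc = -1.0
    for _ in range(n_trials):
        w = rng.dirichlet(np.ones(n_models))  # random point on the simplex
        preds = weighted_ensemble(probs_per_model, w).argmax(axis=1)
        acc = float((preds == labels).mean())
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc
```

Because the ensemble only consumes model *outputs*, each edge device can run a model whose architecture fits its own hardware budget; the server never needs the models to share a parameter layout.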

Link information
DOI
https://doi.org/10.1109/ICMLA51294.2020.00120
Scopus
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85102519049&origin=inward
Scopus Citedby
https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=85102519049&origin=inward
URL
https://dblp.uni-trier.de/conf/icmla/2020
URL
https://dblp.uni-trier.de/db/conf/icmla/icmla2020.html#ThonglekTIIN20
