Gossip Learning as a Decentralized Alternative to Federated Learning

Cited by: 115
Authors
Hegedus, Istvan [1 ]
Danner, Gabor [1 ]
Jelasity, Mark [1 ,2 ]
Affiliations
[1] Univ Szeged, Szeged, Hungary
[2] MTA SZTE Res Grp Artificial Intelligence, Szeged, Hungary
DOI
10.1007/978-3-030-22496-7_5
CLC Classification Number
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Federated learning is a distributed machine learning approach for computing models over data collected by edge devices. Most importantly, the data itself is not collected centrally: a master-worker architecture is applied in which a master node performs aggregation and the edge devices act as workers, not unlike the parameter server approach. Gossip learning also assumes that the data remains at the edge devices, but it requires no aggregation server or any other central component. In this empirical study, we present a thorough comparison of the two approaches. We examine the aggregated cost of machine learning in both cases, also considering a compression technique applicable in both approaches. We apply a real churn trace collected over mobile phones, and we also experiment with different distributions of the training data over the devices. Surprisingly, gossip learning actually outperforms federated learning in all the scenarios where the training data are distributed uniformly over the nodes, and it performs comparably to federated learning overall.
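The serverless protocol described in the abstract (each node keeps its model locally, periodically sends it to a random peer, and merges any received model with its own by averaging before training on local data) can be illustrated with a toy simulation. This is a minimal sketch under stated assumptions, not the paper's implementation: the function names, the fully connected overlay, and the one-dimensional least-squares model are all hypothetical simplifications.

```python
import random

def local_update(model, data, lr=0.1):
    # One SGD step on a 1-D least-squares loss (model * x - y)^2
    # for a single local example; stands in for local training.
    x, y = data
    grad = 2 * (model * x - y) * x
    return model - lr * grad

def gossip_round(models, datasets, rng):
    # One synchronous round of gossip learning: every node sends its
    # current model to one uniformly random peer (no central server).
    n = len(models)
    incoming = [[] for _ in range(n)]
    for i in range(n):
        peer = rng.choice([j for j in range(n) if j != i])
        incoming[peer].append(models[i])
    # Each node merges received models by averaging, then trains locally.
    new_models = []
    for i in range(n):
        m = models[i]
        for received in incoming[i]:
            m = (m + received) / 2
        new_models.append(local_update(m, datasets[i]))
    return new_models
```

Running many rounds over nodes whose local examples all satisfy y = 3x drives every node's model toward the common solution, showing how averaging alone diffuses information without an aggregation server; federated learning would instead collect all models at a master, average them there, and broadcast the result back.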
Pages: 74 - 90
Page count: 17