Gossip Learning as a Decentralized Alternative to Federated Learning

Cited by: 115
Authors
Hegedus, Istvan [1 ]
Danner, Gabor [1 ]
Jelasity, Mark [1 ,2 ]
Affiliations
[1] Univ Szeged, Szeged, Hungary
[2] MTA SZTE Res Grp Artificial Intelligence, Szeged, Hungary
Keywords
DOI
10.1007/978-3-030-22496-7_5
Chinese Library Classification
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
Federated learning is a distributed machine learning approach for computing models over data collected by edge devices. Most importantly, the data itself is not collected centrally: a master-worker architecture is applied in which a master node performs aggregation and the edge devices act as workers, not unlike the parameter server approach. Gossip learning also assumes that the data remains on the edge devices, but it requires no aggregation server or any other central component. In this empirical study, we present a thorough comparison of the two approaches. We examine the aggregated cost of machine learning in both cases, also considering a compression technique applicable to both approaches. We apply a real churn trace collected over mobile phones, and we also experiment with different distributions of the training data over the devices. Surprisingly, gossip learning actually outperforms federated learning in all the scenarios where the training data are distributed uniformly over the nodes, and it performs comparably to federated learning overall.
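
To make the architectural contrast concrete, below is a minimal Python sketch of the gossip-learning loop the abstract describes: each node keeps its data local, periodically sends its model to a random peer, and the receiver merges the incoming model by averaging before taking a local SGD step. This is an illustrative sketch only, not the paper's implementation; Node, local_gradient, and all hyperparameters are hypothetical.

    import random

    def local_gradient(weights, x, y):
        # Gradient of (pred - y)^2 for a linear model on one local example.
        pred = sum(w * xi for w, xi in zip(weights, x))
        return [2.0 * (pred - y) * xi for xi in x]

    class Node:
        def __init__(self, data, dim, lr=0.05):
            self.data = data            # local examples; they never leave the node
            self.weights = [0.0] * dim  # current local model
            self.lr = lr

        def on_receive(self, peer_weights):
            # Merge: average the received model into the local one ...
            self.weights = [(a + b) / 2.0
                            for a, b in zip(self.weights, peer_weights)]
            # ... then update: one SGD step on a randomly chosen local example.
            x, y = random.choice(self.data)
            grad = local_gradient(self.weights, x, y)
            self.weights = [w - self.lr * g for w, g in zip(self.weights, grad)]

    def gossip_round(nodes):
        # Every node sends its model to one uniformly random peer. There is
        # no aggregation server and no global synchronization barrier.
        for sender in nodes:
            receiver = random.choice([n for n in nodes if n is not sender])
            receiver.on_receive(list(sender.weights))

    def make_local_data(n_points=20):
        # Synthetic local data for y = 2*x + 1; the constant feature is a bias.
        pts = []
        for _ in range(n_points):
            x = random.uniform(-1.0, 1.0)
            pts.append(([x, 1.0], 2.0 * x + 1.0))
        return pts

    random.seed(0)
    nodes = [Node(make_local_data(), dim=2) for _ in range(10)]
    for _ in range(500):
        gossip_round(nodes)
    print([round(w, 2) for w in nodes[0].weights])  # drifts toward [2.0, 1.0]

In federated learning the same merge would instead happen at a central master that averages all worker models once per round; the sketch replaces that master with random pairwise exchanges between peers.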
Pages: 74-90
Page count: 17
Related papers (50 in total)
  • [1] Decentralized learning works: An empirical comparison of gossip learning and federated learning
    Hegedus, Istvan
    Danner, Gabor
    Jelasity, Mark
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2021, 148 : 109 - 124
  • [2] Decentralized Recommendation Based on Matrix Factorization: A Comparison of Gossip and Federated Learning
    Hegedus, Istvan
    Danner, Gabor
    Jelasity, Mark
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT I, 2020, 1167 : 317 - 332
  • [3] On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Federated Learning
    Hashemi, Abolfazl
    Acharya, Anish
    Das, Rudrajit
    Vikalo, Haris
    Sanghavi, Sujay
    Dhillon, Inderjit
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (11) : 2727 - 2739
  • [4] FedDual: Pair-Wise Gossip Helps Federated Learning in Large Decentralized Networks
    Chen, Qian
    Wang, Zilong
    Wang, Hongbo
    Lin, Xiaodong
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2023, 18 : 335 - 350
  • [5] Confederated Learning: Federated Learning With Decentralized Edge Servers
    Wang, Bin
    Fang, Jun
    Li, Hongbin
    Yuan, Xiaojun
    Ling, Qing
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 248 - 263
  • [6] Decentralized Federated Learning with Prototype Exchange
    Qi, Lu
    Chen, Haoze
    Zou, Hongliang
    Chen, Shaohua
    Zhang, Xiaoying
    Chen, Hongyan
    MATHEMATICS, 2025, 13 (02)
  • [7] Fedstellar: A Platform for Decentralized Federated Learning
    Beltran, Enrique Tomas Martinez
    Gomez, Angel Luis Perales
    Feng, Chao
    Sanchez, Pedro Miguel
    Bernal, Sergio Lopez
    Bovet, Gerome
    Perez, Manuel Gil
    Perez, Gregorio Martinez
    Celdran, Alberto Huertas
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 242
  • [8] Towards Efficient Decentralized Federated Learning
    Pappas, Christodoulos
    Papadopoulos, Dimitrios
    Chatzopoulos, Dimitris
    Panagou, Eleni
    Lalis, Spyros
    Vavalis, Manolis
    2022 IEEE 42ND INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS WORKSHOPS (ICDCSW), 2022, : 79 - 85
  • [9] Communication Topologies for Decentralized Federated Learning
    Doetzer, Michael
    Mao, Yixin
    Diepold, Klaus
    2023 EIGHTH INTERNATIONAL CONFERENCE ON FOG AND MOBILE EDGE COMPUTING, FMEC, 2023, : 232 - 238
  • [10] Decentralized Federated Learning: A Survey and Perspective
    Yuan, Liangqi
    Wang, Ziran
    Sun, Lichao
    Yu, Philip S.
    Brinton, Christopher G.
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (21) : 34617 - 34638