Topology Design and Graph Embedding for Decentralized Federated Learning

Cited by: 0
Authors
Duan Y. [1 ]
Li X. [1 ]
Wu J. [1 ]
Affiliations
[1] Temple University, Department of Computer and Information Sciences, Philadelphia, PA 19122, USA
Source
Intelligent and Converged Networks | 2024, Vol. 5, No. 2
Keywords
data heterogeneity; decentralized federated learning; graph embedding; network topology;
DOI
10.23919/ICN.2024.0008
Abstract
Federated learning has been widely employed in many applications to protect the data privacy of participating clients. Although the training data are decentralized across devices in federated learning, the model parameters are usually stored in a centralized manner. Centralized federated learning is easy to implement; however, the centralized scheme creates a communication bottleneck at the central server, which can significantly slow down training. To improve training efficiency, we investigate the decentralized federated learning scheme, which has become feasible with the rapid development of device-to-device communication techniques under 5G. Nevertheless, the convergence rate of learning models in the decentralized scheme depends on the network topology design. We propose optimizing the topology design to improve training efficiency for decentralized federated learning, a non-trivial problem, especially when data heterogeneity is considered. In this paper, we first demonstrate the advantages of the hypercube topology and present a hypercube graph construction method that reduces data heterogeneity by carefully selecting the neighbors of each training device, a process that resembles classic graph embedding. In addition, we propose a heuristic method for generating torus graphs. Moreover, we explore the communication patterns of the hypercube topology and propose a sequential synchronization scheme that reduces the communication cost during training; a batch synchronization scheme is also presented to fine-tune the communication pattern for the hypercube topology. Experiments on real-world datasets show that our graph construction methods accelerate training and that our sequential synchronization scheme significantly reduces the overall communication traffic during training. © 2020 Tsinghua University Press.
Pages: 100-115 (15 pages)
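
To make the techniques described in the abstract concrete, the two sketches below illustrate (1) hypercube neighbor selection via a greedy, heterogeneity-aware embedding and (2) dimension-ordered model averaging over a hypercube, one plausible reading of a sequential synchronization scheme. These are minimal illustrations, not the authors' implementation: the greedy cost function, the vertex visiting order, and the names hypercube_neighbors, greedy_embed, and dimension_ordered_average are assumptions made here for exposition.

import numpy as np

def hypercube_neighbors(v, dim):
    # In a dim-dimensional hypercube, the neighbors of vertex v are the
    # vertices whose binary labels differ from v in exactly one bit.
    return [v ^ (1 << k) for k in range(dim)]

def greedy_embed(client_dists, dim):
    # client_dists: (2**dim, C) array; row i is client i's label distribution.
    # Greedy heuristic (an assumption, not the paper's exact objective):
    # place each client so that the mean label distribution over the vertex
    # and its already-placed neighbors stays close to the global mean,
    # spreading heterogeneous clients across every neighborhood.
    n = 2 ** dim
    assert client_dists.shape[0] == n
    global_mean = client_dists.mean(axis=0)
    placement = [None] * n            # vertex -> client index
    unplaced = set(range(n))
    for v in range(n):
        placed_nb = [placement[u] for u in hypercube_neighbors(v, dim)
                     if placement[u] is not None]
        best_c, best_cost = None, float("inf")
        for c in sorted(unplaced):
            group = np.vstack([client_dists[c]]
                              + [client_dists[p] for p in placed_nb])
            cost = np.abs(group.mean(axis=0) - global_mean).sum()
            if cost < best_cost:
                best_c, best_cost = c, cost
        placement[v] = best_c
        unplaced.remove(best_c)
    return placement

The second sketch synchronizes models one hypercube dimension per round via recursive pairwise averaging. After dim rounds every vertex holds the global average, so each node sends dim messages per full exchange instead of gossiping with all neighbors in every round; the paper's actual sequential and batch schemes may differ in detail.

def dimension_ordered_average(models, dim):
    # Round k: every vertex v exchanges with v ^ (1 << k) and both keep
    # the pairwise mean. Standard hypercube all-reduce with averaging.
    models = [m.copy() for m in models]
    for k in range(dim):
        for v in range(len(models)):
            u = v ^ (1 << k)
            if v < u:                 # handle each edge once per round
                avg = (models[v] + models[u]) / 2.0
                models[v], models[u] = avg, avg.copy()
    return models

# Toy usage: 8 clients on a 3-dimensional hypercube, 10 label classes.
rng = np.random.default_rng(0)
dists = rng.dirichlet(alpha=[0.3] * 10, size=8)    # skewed label distributions
placement = greedy_embed(dists, dim=3)             # vertex -> client mapping
models = [rng.normal(size=4) for _ in range(8)]    # toy parameter vectors
synced = dimension_ordered_average(models, dim=3)
assert np.allclose(synced[0], np.mean(models, axis=0))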