Topology Design and Graph Embedding for Decentralized Federated Learning

Cited by: 0
Authors
Duan Y. [1 ]
Li X. [1 ]
Wu J. [1 ]
Affiliations
[1] Department of Computer and Information Sciences, Temple University, Philadelphia, PA 19122, USA
Source
Intelligent and Converged Networks, 2024, Vol. 5, No. 2
Keywords
data heterogeneity; decentralized federated learning; graph embedding; network topology;
DOI
10.23919/ICN.2024.0008
Abstract
Federated learning has been widely employed in many applications to protect the data privacy of participating clients. Although the dataset is decentralized among training devices in federated learning, the model parameters are usually stored in a centralized manner. Centralized federated learning is easy to implement; however, a centralized scheme causes a communication bottleneck at the central server, which may significantly slow down the training process. To improve training efficiency, we investigate the decentralized federated learning scheme, which has become feasible with the rapid development of device-to-device communication techniques under 5G. Nevertheless, the convergence rate of learning models in the decentralized scheme depends on the network topology design. We propose optimizing the topology design to improve training efficiency for decentralized federated learning, a non-trivial problem, especially when data heterogeneity is considered. In this paper, we first demonstrate the advantage of hypercube topology and present a hypercube graph construction method that reduces data heterogeneity by carefully selecting the neighbors of each training device, a process that resembles classic graph embedding. In addition, we propose a heuristic method for generating torus graphs. Moreover, we explore the communication patterns in hypercube topology and propose a sequential synchronization scheme to reduce communication costs during training, along with a batch synchronization scheme that fine-tunes the communication pattern for hypercube topology. Experiments on real-world datasets show that our proposed graph construction methods accelerate the training process and that our sequential synchronization scheme significantly reduces the overall communication traffic during training. © 2024 Tsinghua University Press.
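To make the hypercube structure concrete, the following is a minimal Python sketch, not the authors' implementation: device IDs are d-bit labels, neighbors differ in exactly one bit, and a dimension-wise averaging pass illustrates the flavor of the sequential synchronization scheme summarized in the abstract. The function names, the plain pairwise-averaging rule, and the NumPy model representation are assumptions made here for illustration.

import numpy as np

def hypercube_neighbors(node: int, dim: int) -> list[int]:
    # In a dim-dimensional hypercube, the neighbors of a node are
    # exactly the IDs obtained by flipping one bit of its label.
    return [node ^ (1 << b) for b in range(dim)]

def sequential_sync(models: list[np.ndarray], dim: int) -> None:
    # One dimension-wise pass: in step b, every node averages its
    # model with its single neighbor across dimension b. After all
    # dim steps, each model has mixed with all 2**dim nodes (a
    # butterfly pattern) and ends at the global average.
    n = len(models)
    assert n == 1 << dim, "need exactly 2**dim participating devices"
    for b in range(dim):
        updated = []
        for u in range(n):
            v = u ^ (1 << b)  # partner of node u across dimension b
            updated.append(0.5 * (models[u] + models[v]))
        models[:] = updated

if __name__ == "__main__":
    dim = 3
    rng = np.random.default_rng(0)
    models = [rng.normal(size=4) for _ in range(1 << dim)]
    sequential_sync(models, dim)
    # All 8 models now coincide with the global average.
    print(np.allclose(models[0], np.mean(models, axis=0)))

Each node in a hypercube has only log2(n) neighbors, which is what keeps per-round communication low in this pattern; the paper's actual schemes, including heterogeneity-aware neighbor selection and batch synchronization, refine this basic structure.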
Pages: 100–115 (15 pages)
Related Papers (10 of 50 shown)
  • [1] GCN-Based Topology Design for Decentralized Federated Learning in IoV
    Li, Yupeng
    Xie, Qi
    Wang, Weixu
    Zhou, Xiaobo
    Li, Keqiu
    2022 23RD ASIA-PACIFIC NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM (APNOMS 2022), 2022, : 181 - 186
  • [2] Graph Federated Learning Based on the Decentralized Framework
    Liu, Peilin
    Tang, Yanni
    Zhang, Mingyue
    Chen, Wu
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT III, 2023, 14256 : 452 - 463
  • [3] IAB Topology Design: A Graph Embedding and Deep Reinforcement Learning Approach
    Simsek, Meryem
    Orhan, Oner
    Nassar, Marcel
    Elibol, Oguz
    Nikopour, Hosein
    IEEE COMMUNICATIONS LETTERS, 2021, 25 (02) : 489 - 493
  • [4] Impact of network topology on the performance of Decentralized Federated Learning
    Palmieri, Luigi
    Boldrini, Chiara
    Valerio, Lorenzo
    Passarella, Andrea
    Conti, Marco
    COMPUTER NETWORKS, 2024, 253
  • [5] Enhancing Decentralized and Personalized Federated Learning With Topology Construction
    Chen, Suo
    Xu, Yang
    Xu, Hongli
    Ma, Zhenguo
    Wang, Zhiyuan
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9692 - 9707
  • [6] Decentralized Federated Graph Learning via Surrogate Model
    Zhang, Bolin
    Gu, Ruichun
    Liu, Haiying
CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (02) : 2521 - 2535
  • [7] Decentralized Graph Federated Multitask Learning for Streaming Data
    Gogineni, Vinay Chakravarthi
    Werner, Stefan
    Huang, Yih-Fang
    Kuh, Anthony
    2022 56TH ANNUAL CONFERENCE ON INFORMATION SCIENCES AND SYSTEMS (CISS), 2022, : 101 - 106
  • [8] Impact of Network Topology on the Convergence of Decentralized Federated Learning Systems
    Kavalionak, Hanna
    Carlini, Emanuele
    Dazzi, Patrizio
    Ferrucci, Luca
    Mordacchini, Matteo
    Coppola, Massimo
    26TH IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (IEEE ISCC 2021), 2021,
  • [9] Decentralized Aggregation Design and Study of Federated Learning
    Malladi, Venkata
    Li, Yi
    Siddula, Madhuri
Seo, Daehee
    Huang, Yan
    2021 IEEE SMARTWORLD, UBIQUITOUS INTELLIGENCE & COMPUTING, ADVANCED & TRUSTED COMPUTING, SCALABLE COMPUTING & COMMUNICATIONS, INTERNET OF PEOPLE, AND SMART CITY INNOVATIONS (SMARTWORLD/SCALCOM/UIC/ATC/IOP/SCI 2021), 2021, : 328 - 337
  • [10] Asynchronous Federated Learning in Decentralized Topology Based on Dynamic Average Consensus
    Chen, Zhikun
    Pan, Jiaqi
    Zhang, Sihai
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 2822 - 2827