D-Cliques: Compensating for Data Heterogeneity with Topology in Decentralized Federated Learning

Cited by: 10
Authors
Bellet, Aurelien [1 ]
Kermarrec, Anne-Marie [2 ]
Lavoie, Erick [2 ]
Affiliations
[1] INRIA, Lille, France
[2] Ecole Polytech Fed Lausanne, Lausanne, Switzerland
Keywords
Decentralized Learning; Federated Learning; Topology; Heterogeneous Data; Stochastic Gradient Descent;
DOI
10.1109/SRDS55811.2022.00011
CLC (Chinese Library Classification)
TP3 [Computing Technology; Computer Technology]
Discipline Code
0812
Abstract
The convergence speed of machine learning models trained with Federated Learning is significantly affected by heterogeneous data partitions, even more so in a fully decentralized setting without a central server. In this paper, we show that the impact of label distribution skew, an important type of data heterogeneity, can be significantly reduced by carefully designing the underlying communication topology. We present D-Cliques, a novel topology that reduces gradient bias by grouping nodes into sparsely interconnected cliques such that the label distribution within each clique is representative of the global label distribution. We also show how to adapt the updates of decentralized SGD to obtain unbiased gradients and implement an effective momentum with D-Cliques. Our extensive empirical evaluation on MNIST and CIFAR10 validates our design and demonstrates that our approach achieves a convergence speed similar to that of a fully connected topology, while providing a significant reduction in the number of edges and messages. In a 1000-node topology, D-Cliques requires 98% fewer edges and 96% fewer messages in total, with further possible gains from using a small-world topology across cliques.
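To illustrate the core idea from the abstract, the sketch below groups nodes into cliques whose label distribution matches the global one. It assumes an idealized setting of extreme label skew (each node holds data of a single label, labels spread uniformly across nodes), so placing one node per label in each clique makes every clique's label distribution representative. The function name and the round-robin strategy are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical sketch of clique construction under extreme label skew.
# Assumption: each node holds samples of exactly one label, and every
# label is held by the same number of nodes. Grouping one node per label
# into each clique then reproduces the global (uniform) label distribution
# inside every clique.
from collections import defaultdict

def build_cliques(node_labels, num_labels):
    """Group nodes into cliques of size `num_labels`, one node per label."""
    by_label = defaultdict(list)
    for node, label in node_labels.items():
        by_label[label].append(node)
    cliques = []
    # Round-robin: draw one node of each label for every new clique.
    while all(by_label[label] for label in range(num_labels)):
        cliques.append([by_label[label].pop() for label in range(num_labels)])
    return cliques

# Example: 100 nodes, 10 labels, 10 nodes per label -> 10 cliques of size 10.
nodes = {i: i % 10 for i in range(100)}
cliques = build_cliques(nodes, 10)
print(len(cliques))                         # 10
print(sorted(n % 10 for n in cliques[0]))   # [0, 1, 2, ..., 9]
```

Within each clique, nodes are fully connected, and cliques are linked by a small number of inter-clique edges; this is what yields the edge and message savings reported in the abstract relative to a fully connected graph.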
Pages: 1-11 (11 pages)
Related Papers (50 in total)
  • [1] Semi-Decentralized Federated Edge Learning With Data and Device Heterogeneity
    Sun, Yuchang
    Shao, Jiawei
    Mao, Yuyi
    Wang, Jessie Hui
    Zhang, Jun
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2023, 20(02): 1487-1501
  • [2] Topology Learning for Heterogeneous Decentralized Federated Learning Over Unreliable D2D Networks
    Wu, Zheshun
    Xu, Zenglin
    Zeng, Dun
    Li, Junfan
    Liu, Jie
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73(08): 12201-12206
  • [3] Impact of network topology on the performance of Decentralized Federated Learning
    Palmieri, Luigi
    Boldrini, Chiara
    Valerio, Lorenzo
    Passarella, Andrea
    Conti, Marco
    COMPUTER NETWORKS, 2024, 253
  • [4] Topology Design and Graph Embedding for Decentralized Federated Learning
    Duan Y.
    Li X.
    Wu J.
    Intelligent and Converged Networks, 2024, 5(02): 100-115
  • [5] Enhancing Decentralized and Personalized Federated Learning With Topology Construction
    Chen, Suo
    Xu, Yang
    Xu, Hongli
    Ma, Zhenguo
    Wang, Zhiyuan
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23(10): 9692-9707
  • [6] Quantum Federated Learning With Decentralized Data
    Huang, Rui
    Tan, Xiaoqing
    Xu, Qingshan
    IEEE JOURNAL OF SELECTED TOPICS IN QUANTUM ELECTRONICS, 2022, 28(04)
  • [7] Impact of Network Topology on the Convergence of Decentralized Federated Learning Systems
    Kavalionak, Hanna
    Carlini, Emanuele
    Dazzi, Patrizio
    Ferrucci, Luca
    Mordacchini, Matteo
    Coppola, Massimo
    26TH IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (IEEE ISCC 2021), 2021
  • [8] HADFL: Heterogeneity-aware Decentralized Federated Learning Framework
    Cao, Jing
    Lian, Zirui
    Liu, Weihong
    Zhu, Zongwei
    Ji, Cheng
    2021 58TH ACM/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2021: 1-6
  • [9] A decentralized data evaluation framework in federated learning
    Bhatia, Laveen
    Samet, Saeed
    BLOCKCHAIN-RESEARCH AND APPLICATIONS, 2023, 4(04)
  • [10] Rethinking the Data Heterogeneity in Federated Learning
    Wang, Jiayi
    Wang, Shiqiang
    Chen, Rong-Rong
    Ji, Mingyue
    FIFTY-SEVENTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, IEEECONF, 2023: 624-628