Federated Learning Under Statistical Heterogeneity on Riemannian Manifolds

Cited by: 0
Authors
Ahmad, Adnan [1]
Luo, Wei [1]
Robles-Kelly, Antonio [1,2]
Affiliations
[1] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Def Sci & Technol Grp, Edinburgh, SA 5111, Australia
Keywords
DOI
10.1007/978-3-031-33374-3_30
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL) is a collaborative machine learning paradigm in which clients with limited data jointly train a single "best" global model based on consensus. A major challenge facing FL is the statistical heterogeneity of the data across local clients. When client models trained on non-IID or imbalanced data are aggregated with simple averaging schemes such as FedAvg, the result can be a biased global model and slow training convergence. To address this challenge, we propose a novel and robust aggregation scheme, FedMAN, which assigns each client a weighting factor based on its statistical consistency with other clients. This statistical consistency is measured on a Riemannian manifold spanned by the covariances of the local clients' output logits. We demonstrate the superior performance of FedMAN over several FL baselines (FedAvg, FedProx, and FedCurv) on various benchmark datasets (MNIST, Fashion-MNIST, and CIFAR-10) under a wide range of degrees of statistical heterogeneity.
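The abstract only sketches the aggregation rule, so the following is a minimal, hypothetical illustration of the idea in Python. Each client's output logits are summarized by a covariance matrix, statistical consistency is taken as closeness under a Riemannian metric on SPD matrices (the log-Euclidean metric is assumed here purely for illustration; the paper may use a different metric), and the resulting weights drive a weighted model average. All function names, the softmax-style weighting, and the synthetic data are assumptions, not the authors' implementation.

```python
import numpy as np

def spd_logm(S, eps=1e-6):
    """Matrix logarithm of a symmetric positive (semi-)definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    w = np.clip(w, eps, None)              # guard against numerically non-positive eigenvalues
    return (V * np.log(w)) @ V.T

def log_euclidean_dist(S1, S2):
    """Log-Euclidean distance between two SPD matrices (one common Riemannian metric)."""
    return np.linalg.norm(spd_logm(S1) - spd_logm(S2), ord="fro")

def consistency_weights(client_logits, temperature=1.0):
    """Turn pairwise manifold distances between the clients' logit covariances
    into normalized aggregation weights: clients closer to the consensus get more weight.
    (Hypothetical reading of the FedMAN weighting, not the published algorithm.)"""
    covs = [np.cov(L, rowvar=False) + 1e-6 * np.eye(L.shape[1]) for L in client_logits]
    n = len(covs)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = log_euclidean_dist(covs[i], covs[j])
    mean_dist = D.sum(axis=1) / (n - 1)    # average distance to the other clients
    scores = -mean_dist / temperature      # smaller distance -> larger score
    w = np.exp(scores - scores.max())      # softmax for numerical stability
    return w / w.sum()

def aggregate(client_params, weights):
    """Weighted model averaging: per-parameter convex combination of client updates."""
    return [sum(w * p for w, p in zip(weights, layer)) for layer in zip(*client_params)]

# Synthetic example: 3 clients, each with 100 samples of 10-class logits and a 2-tensor model.
rng = np.random.default_rng(0)
client_logits = [rng.normal(size=(100, 10)) for _ in range(3)]
client_params = [[rng.normal(size=(10, 10)), rng.normal(size=10)] for _ in range(3)]
weights = consistency_weights(client_logits)
global_params = aggregate(client_params, weights)
```

Under this sketch, a client whose logit covariance sits far from the others on the manifold contributes less to the global model than it would under plain FedAvg, which is the stated motivation for the consistency-based weighting.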
Pages: 380 - 392
Page count: 13
Related Papers
50 records in total
  • [41] Riemannian Neural SDE: Learning Stochastic Representations on Manifolds
    Park, Sung Woo
    Kim, Hyomin
    Lee, Kyungjae
    Kwon, Junseok
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [42] SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
    Ye, Gui-Bo
    Zhou, Ding-Xuan
    ANALYSIS AND APPLICATIONS, 2009, 7 (03) : 309 - 339
  • [43] Learning Interaction Kernels for Agent Systems on Riemannian Manifolds
    Maggioni, Mauro
    Miller, Jason J.
    Qiu, Hongda
    Zhong, Ming
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [44] Extended Hamiltonian Learning on Riemannian Manifolds: Theoretical Aspects
    Fiori, Simone
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2011, 22 (05) : 687 - 700
  • [45] An Efficient and Security Federated Learning for Data Heterogeneity
    Gao, Junchen
    Ning, Zhenhu
    Cui, Meili
    Xing, Shuaikun
    2024 4TH INTERNATIONAL CONFERENCE ON INFORMATION COMMUNICATION AND SOFTWARE ENGINEERING, ICICSE 2024, 2024 : 1 - 5
  • [46] Fed-QSSL: A Framework for Personalized Federated Learning under Bitwidth and Data Heterogeneity
    Chen, Yiyue
    Vikalo, Haris
    Wang, Chianing
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 10, 2024 : 11443 - 11452
  • [47] RIEMANNIAN SVRG WITH BARZILAI-BORWEIN SCHEME FOR FEDERATED LEARNING
    Xiao, He
    Yan, Tao
    Zhao, Shimin
    JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2025, 21 (02) : 1546 - 1567
  • [48] Privacy preserving federated learning for full heterogeneity
    Chen, Kongyang
    Zhang, Xiaoxue
    Zhou, Xiuhua
    Mi, Bing
    Xiao, Yatie
    Zhou, Lei
    Wu, Zhen
    Wu, Lin
    Wang, Xiaoying
    ISA TRANSACTIONS, 2023, 141 : 73 - 83
  • [49] Mode Connectivity in Federated Learning with Data Heterogeneity
    Zhou, Tailin
    Zhang, Jun
    Tsang, Danny H. K.
    FIFTY-SEVENTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, IEEECONF, 2023 : 1600 - 1604
  • [50] Heterogeneity-aware fair federated learning
    Li, Xiaoli
    Zhao, Siran
    Chen, Chuan
    Zheng, Zibin
    INFORMATION SCIENCES, 2023, 619 : 968 - 986