Federated Learning Under Statistical Heterogeneity on Riemannian Manifolds

Cited by: 0
Authors:
Ahmad, Adnan [1]
Luo, Wei [1]
Robles-Kelly, Antonio [1,2]
Affiliations:
[1] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Def Sci & Technol Grp, Edinburgh, SA 5111, Australia
DOI: 10.1007/978-3-031-33374-3_30
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Federated learning (FL) is a collaborative machine learning paradigm in which clients with limited data jointly train a single "best" global model based on consensus. A major challenge facing FL is the statistical heterogeneity of the data held by the local clients. When client models are trained on non-IID or imbalanced data and aggregated with simple averaging schemes such as FedAvg, the result can be a biased global model and slow training convergence. To address this challenge, we propose a novel and robust aggregation scheme, FedMAN, which assigns each client a weighting factor based on its statistical consistency with the other clients. This statistical consistency is measured on a Riemannian manifold spanned by the covariances of the local clients' output logits. We demonstrate the superior performance of FedMAN over several FL baselines (FedAvg, FedProx, and FedCurv) on various benchmark datasets (MNIST, Fashion-MNIST, and CIFAR-10) under a wide range of degrees of statistical heterogeneity.
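The abstract only describes FedMAN at a high level, so the following is a minimal, hypothetical sketch of the idea as stated: each client's output-logit covariance is treated as a point on the manifold of symmetric positive-definite matrices, clients whose covariances lie close to the others under a Riemannian metric receive larger aggregation weights, and the global model is a weighted average of client parameters. The choice of the log-Euclidean distance, the exponential distance-to-weight rule, and all function names (logit_covariance, consistency_weights, aggregate) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a FedMAN-style aggregation step (not the authors' code).
# Assumption: "statistical consistency" is approximated by pairwise log-Euclidean
# distances between the SPD covariance matrices of each client's output logits.
import numpy as np
from scipy.linalg import logm


def logit_covariance(logits, eps=1e-6):
    """Covariance of a client's output logits, regularised to stay SPD."""
    cov = np.cov(logits, rowvar=False)
    return cov + eps * np.eye(cov.shape[0])


def log_euclidean_dist(a, b):
    """Log-Euclidean distance between two SPD matrices (a Riemannian metric)."""
    return np.linalg.norm(logm(a) - logm(b), ord="fro")


def consistency_weights(client_logits):
    """Weight each client by how close its logit covariance is to the others'."""
    covs = [logit_covariance(l) for l in client_logits]
    n = len(covs)
    # Mean pairwise manifold distance to every other client.
    avg_dist = np.array([
        np.mean([log_euclidean_dist(covs[i], covs[j]) for j in range(n) if j != i])
        for i in range(n)
    ])
    # Smaller average distance -> more consistent -> larger weight.
    scores = np.exp(-avg_dist)
    return scores / scores.sum()


def aggregate(client_params, weights):
    """Weighted average of client model parameters (FedAvg-style update)."""
    return [sum(w * p for w, p in zip(weights, layers))
            for layers in zip(*client_params)]


# Usage: three clients, 10-class logits over a local batch, toy parameters.
rng = np.random.default_rng(0)
client_logits = [rng.normal(size=(128, 10)) for _ in range(3)]
client_params = [[rng.normal(size=(10, 10)), rng.normal(size=10)] for _ in range(3)]
weights = consistency_weights(client_logits)
global_params = aggregate(client_params, weights)
print("client weights:", np.round(weights, 3))
```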
Pages: 380-392 (13 pages)
Related Papers (50 in total)
  • [31] Deng, Yongheng; Chen, Weining; Ren, Ju; Lyu, Feng; Liu, Yang; Liu, Yunxin; Zhang, Yaoxue. TailorFL: Dual-Personalized Federated Learning under System and Data Heterogeneity. Proceedings of the Twentieth ACM Conference on Embedded Networked Sensor Systems (SenSys 2022), 2022: 592-606.
  • [32] Pennec, Xavier. Statistical Computing on Manifolds: From Riemannian Geometry to Computational Anatomy. Emerging Trends in Visual Computing, 2009, 5416: 347-386.
  • [33] de Souza, Lucas Airam C.; Camilo, Gustavo F.; Rebello, Gabriel Antonio F.; Sammarco, Matteo; Campista, Miguel Elias M.; Costa, Luis Henrique M. K. ATHENA-FL: Avoiding Statistical Heterogeneity with One-versus-All in Federated Learning. Journal of Internet Services and Applications, 2024, 15(01): 273-288.
  • [34] Fan, Ziqing; Wang, Yanfeng; Yao, Jiangchao; Lyu, Lingjuan; Zhang, Ya; Tian, Qi. FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation. 2022 IEEE International Conference on Data Mining (ICDM), 2022: 131-140.
  • [35] Ahmad, Adnan; Chau, Vinh Loi; Robles-Kelly, Antonio; Gao, Shang; Gao, Longxiang; Chi, Lianhua; Luo, Wei. Bio-Inspired Dual-Network Model to Tackle Statistical Heterogeneity in Federated Learning. 2023 International Joint Conference on Neural Networks (IJCNN), 2023.
  • [36] Boso, F.; Tartakovsky, D. M. Learning on Dynamic Statistical Manifolds. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2020, 476(2239).
  • [37] Gu, Jia; Chen, Song Xi. Statistical Inference for Decentralized Federated Learning. Annals of Statistics, 2024, 52(06): 2931-2955.
  • [38] Jaquier, Noemie; Rozo, Leonel; Calinon, Sylvain; Burger, Mathias. Bayesian Optimization Meets Riemannian Manifolds in Robot Learning. Conference on Robot Learning, 2019, 100.
  • [39] Fiori, Simone. Extended Hamiltonian Learning on Riemannian Manifolds: Numerical Aspects. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(01): 7-21.
  • [40] Wang, Xi; Tu, Zhipeng; Hong, Yiguang; Wu, Yingyi; Shi, Guodong. No-regret Online Learning over Riemannian Manifolds. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.