FedCiR: Client-Invariant Representation Learning for Federated Non-IID Features

Cited by: 1
Authors
Li, Zijian [1]
Lin, Zehong [1]
Shao, Jiawei [1]
Mao, Yuyi [2]
Zhang, Jun [1]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
[2] Hong Kong Polytech Univ, Dept Elect & Elect Engn, Hong Kong, Peoples R China
Keywords
Training; Representation learning; Feature extraction; Distributed databases; Data models; Mutual information; Servers; federated learning (FL); non-independent and identically distributed (non-IID) data; edge intelligence
DOI
10.1109/TMC.2024.3376697
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline code
0812
Abstract
Federated learning (FL) is a distributed learning paradigm that maximizes the potential of data-driven models for edge devices without sharing their raw data. However, devices often hold non-independent and identically distributed (non-IID) data, meaning their local data distributions can vary significantly. This heterogeneity in input data distributions across devices, commonly referred to as the feature shift problem, can adversely affect the training convergence and accuracy of the global model. To analyze the intrinsic causes of the feature shift problem, we develop a generalization error bound for FL, which motivates us to propose FedCiR, a client-invariant representation learning framework that enables clients to extract informative and client-invariant features. Specifically, we maximize the mutual information between representations and labels to encourage the representations to carry essential classification knowledge, and minimize the mutual information between the client set and representations conditioned on labels to encourage the representations to be client-invariant. We further incorporate two regularizers into the FL framework that bound these mutual information terms using an approximate global representation distribution, compensating for the absence of the ground-truth global representation distribution and thus achieving informative and client-invariant feature extraction. To approximate the global representation distribution, we propose a data-free mechanism performed by the server without compromising privacy. Extensive experiments demonstrate the effectiveness of our approach in achieving client-invariant representation learning and mitigating the data heterogeneity issue.
Pages: 10509-10522
Page count: 14
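To make the training objective described in the abstract concrete, the LaTeX sketch below writes out one standard variational form of a per-client loss: a cross-entropy term serving as a lower-bound surrogate for the representation-label mutual information I(Z;Y), plus a KL regularizer toward an approximate global representation distribution serving as an upper-bound surrogate for the client-representation mutual information I(C;Z|Y). The symbols f_{\theta_k}, g_\phi, q_g(z|y), and \lambda are illustrative placeholders rather than the paper's notation; this is a sketch of the idea under those assumptions, not the authors' exact formulation.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A minimal sketch (not the paper's exact loss) of a per-client objective
% consistent with the abstract: a cross-entropy term acting as a variational
% lower-bound surrogate for I(Z;Y), plus a KL regularizer toward an
% approximate global representation distribution acting as a variational
% upper-bound surrogate for I(C;Z|Y).
% Illustrative names: f_{\theta_k} = client-k feature extractor,
% g_{\phi} = classifier head, q_g(z|y) = server-side approximate global
% representation distribution, \lambda = trade-off weight.
\begin{align*}
\min_{\theta_k,\,\phi}\quad
  & \underbrace{\mathbb{E}_{(x,y)\sim\mathcal{D}_k}\bigl[-\log g_{\phi}(y \mid z)\bigr]}_{\text{surrogate for maximizing } I(Z;Y)} \\
  & \; + \lambda\,
  \underbrace{\mathbb{E}_{(x,y)\sim\mathcal{D}_k}\Bigl[\mathrm{KL}\bigl(p_{\theta_k}(z \mid x) \,\big\|\, q_{g}(z \mid y)\bigr)\Bigr]}_{\text{surrogate for minimizing } I(C;\,Z \mid Y)},
  \qquad z \sim p_{\theta_k}(z \mid x).
\end{align*}
\end{document}

Under this reading, the server's data-free mechanism would supply q_g(z|y), the shared reference distribution toward which every client regularizes its local representations.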