Mutual Information Driven Federated Learning

Cited by: 29
Authors
Uddin, Md Palash [1 ]
Xiang, Yong [1 ]
Lu, Xuequan [1 ]
Yearwood, John [2 ]
Gao, Longxiang [1 ]
Affiliations
[1] Deakin Univ, Deakin Blockchain Innovat Lab, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
Keywords
Data models; Training; Computational modeling; Servers; Mathematical model; Convergence; Analytical models; Distributed learning; federated learning; parallel optimization; data parallelism; information theory; mutual information; communication bottleneck; data heterogeneity; FEATURE-SELECTION;
DOI
10.1109/TPDS.2020.3040981
CLC number
TP301 [Theory and methods];
Subject classification code
081202;
Abstract
Federated Learning (FL) is an emerging research field that yields a trained global model from different local clients without violating data privacy. Existing FL techniques often ignore how the local models differ from the aggregated global model during the client-side weight update, as well as how the local models differ from one another during server-side aggregation. In this article, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the top effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on MNIST, CIFAR-10, ImageNet, and the clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
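The server-side selection step described in the abstract can be sketched in a minimal form as follows. This is only an illustration of the general idea (rank local models by an MI score against the previous aggregated model, keep the top-k, then average): the function names `histogram_mi`, `select_top_models`, and `aggregate`, the histogram-based MI estimator over flattened weights, the choice that higher MI means "more effective", and the FedAvg-style mean are all assumptions made for the sketch, not the paper's actual formulation.

```python
import numpy as np

def histogram_mi(x, y, bins=16):
    """Histogram-based estimate of mutual information (in nats)
    between two 1-D samples x and y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_top_models(local_weights, prev_global, k):
    """Rank local models by their MI with the previous aggregated
    model and keep the top-k (higher MI assumed 'more effective')."""
    scores = [histogram_mi(w.ravel(), prev_global.ravel())
              for w in local_weights]
    order = np.argsort(scores)[::-1]
    return [local_weights[i] for i in order[:k]]

def aggregate(selected):
    """Plain FedAvg-style mean over the selected local models."""
    return np.mean(selected, axis=0)
```

In practice the MI estimator and the selection rule would follow the derivations in the article; the histogram estimator here simply makes the ranking step concrete and self-contained.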
Pages: 1526 - 1538
Page count: 13
Related papers
50 records in total
  • [21] Review of Research on Information Security in Federated Learning
    Duan, Xinru
    Chen, Guirong
    Chen, Aiwang
    Chen, Chen
    Ji, Weifeng
    Computer Engineering and Applications, 2024, 60 (03) : 61 - 77
  • [22] Federated Learning via Disentangled Information Bottleneck
    Uddin, Md Palash
    Xiang, Yong
    Lu, Xuequan
    Yearwood, John
    Gao, Longxiang
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2023, 16 (03) : 1874 - 1889
  • [23] CFLIT: Coexisting Federated Learning and Information Transfer
    Lin, Zehong
    Liu, Hang
    Zhang, Ying-Jun Angela
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (11) : 8436 - 8453
  • [24] Age-of-Information-Aware Federated Learning
    Xu, Yin
    Xiao, Ming-Jun
    Wu, Chen
    Wu, Jie
    Zhou, Jin-Rui
    Sun, He
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2024, 39 (03) : 637 - 653
  • [25] Federated mutual learning: a collaborative machine learning method for heterogeneous data, models, and objectives
    Shen, Tao
    Zhang, Jie
    Jia, Xinkang
    Zhang, Fengda
    Lv, Zheqi
    Kuang, Kun
    Wu, Chao
    Wu, Fei
    FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2023, 24 (10) : 1390 - 1402
  • [26] FedCML: Federated Clustering Mutual Learning with non-IID Data
    Chen, Zekai
    Wang, Fuyi
    Yu, Shengxing
    Liu, Ximeng
    Zheng, Zhiwei
    EURO-PAR 2023: PARALLEL PROCESSING, 2023, 14100 : 623 - 636
  • [27] Model-Driven Quantum Federated Learning (QFL)
    Moin, Armin
    Badii, Atta
    Challenger, Moharram
    COMPANION PROCEEDINGS OF THE 7TH INTERNATIONAL CONFERENCE ON THE ART, SCIENCE, AND ENGINEERING OF PROGRAMMING, PROGRAMMING 2023, 2023, : 111 - 113
  • [28] Federated-Learning-Driven Radio Access Networks
    Foukalas, Fotis
    IEEE WIRELESS COMMUNICATIONS, 2022, 29 (04) : 48 - 54
  • [29] Cluster knowledge-driven vertical federated learning
    Yin, Zilong
    Zhao, Xiaoli
    Wang, Haoyu
    Zhang, Xin
    Guo, Xin
    Fang, Zhijun
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (14): : 20229 - 20252
  • [30] Federated Learning Driven Secure Internet of Medical Things
    Fan, Junqiao
    Wang, Xuehe
    Guo, Yanxiang
    Hu, Xiping
    Hu, Bin
    IEEE WIRELESS COMMUNICATIONS, 2022, 29 (02) : 68 - 75