Mutual Information Driven Federated Learning

Cited by: 29
Authors
Uddin, Md Palash [1 ]
Xiang, Yong [1 ]
Lu, Xuequan [1 ]
Yearwood, John [2 ]
Gao, Longxiang [1 ]
Affiliations
[1] Deakin Univ, Deakin Blockchain Innovat Lab, Sch Informat Technol, Geelong, Vic 3220, Australia
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
Keywords
Data models; Training; Computational modeling; Servers; Mathematical model; Convergence; Analytical models; Distributed learning; federated learning; parallel optimization; data parallelism; information theory; mutual information; communication bottleneck; data heterogeneity; FEATURE-SELECTION;
DOI
10.1109/TPDS.2020.3040981
CLC Classification Number
TP301 [Theory and Methods];
Subject Classification Code
081202 ;
Abstract
Federated Learning (FL) is an emerging research field that yields a globally trained model from different local clients without violating data privacy. Existing FL techniques often ignore the effective distinction between local models and the aggregated global model when performing the client-side weight update, as well as the distinctions among local models during server-side aggregation. In this article, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the top effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on MNIST, CIFAR-10, ImageNet, and the clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
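The server-side step described in the abstract (keep only the local models with the highest MI against the previous aggregated model, then aggregate the survivors) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the histogram-based plug-in MI estimator, the use of model outputs as the MI signal, and plain FedAvg over the selected weights are all assumptions made here for clarity.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in MI estimate (in nats) between two 1-D signals,
    using a joint histogram as the empirical joint distribution."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_and_aggregate(local_weights, local_outputs, prev_global_output, top_k):
    """Score each client by the MI between its model's outputs and the
    previous global model's outputs, keep the top_k clients, and
    average their weight vectors (FedAvg over the selected subset)."""
    scores = [mutual_information(out, prev_global_output) for out in local_outputs]
    chosen = np.argsort(scores)[::-1][:top_k]
    aggregated = np.mean([local_weights[i] for i in chosen], axis=0)
    return aggregated, chosen
```

In this toy form, clients whose outputs are nearly independent noise relative to the previous global model score low MI and are excluded from the round's aggregation.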
Pages: 1526 - 1538
Page count: 13
Related Papers
50 records in total
  • [31] Mutual Information Driven Inverse Consistent Nonlinear Registration
    Tao, Guozhi
    He, Renjie
    Datta, Sushmita
    Narayana, Ponnada A.
    2008 30TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, VOLS 1-8, 2008, : 3957 - 3960
  • [32] Mutual Information-driven Pan-sharpening
    Zhou, Man
    Yan, Keyu
    Huang, Jie
    Yang, Zihe
    Fu, Xueyang
    Zhao, Feng
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 1788 - 1798
  • [33] Viewpoint-driven simplification using mutual information
    Castello, P.
    Sbert, M.
    Chover, M.
    Feixas, M.
    COMPUTERS & GRAPHICS-UK, 2008, 32 (04): : 451 - 463
  • [34] Mutual Information Analysis in Multimodal Learning Systems
    Hadizadeh, Hadi
    Yeganli, S. Faegheh
    Rashidi, Bahador
    Bajic, Ivan V.
    2024 IEEE 7TH INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL, MIPR 2024, 2024, : 390 - 395
  • [35] Dissecting Deep Learning Networks - Visualizing Mutual Information
    Fang, Hui
    Wang, Victoria
    Yamaguchi, Motonori
    ENTROPY, 2018, 20 (11)
  • [36] Mutual Information Regularized Offline Reinforcement Learning
    Ma, Xiao
    Kang, Bingyi
    Xu, Zhongwen
    Lin, Min
    Yan, Shuicheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [37] Learning from examples with quadratic mutual information
    Xu, DX
    Principe, JC
    NEURAL NETWORKS FOR SIGNAL PROCESSING VIII, 1998, : 155 - 164
  • [38] Learning tree network based on mutual information
    Hong, Yinghan
    Mai, Guizhen
    Liu, Zhusong
    Metallurgical and Mining Industry, 2015, 7 (12): : 146 - 154
  • [39] Classification Active Learning Based on Mutual Information
    Sourati, Jamshid
    Akcakaya, Murat
    Dy, Jennifer G.
    Leen, Todd K.
    Erdogmus, Deniz
    ENTROPY, 2016, 18 (02)
  • [40] Optimistic Active Learning using Mutual Information
    Guo, Yuhong
    Greiner, Russ
    20TH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2007, : 823 - 829