Minimum Gaussian Noise Variance of Federated Learning in the Presence of Mutual Information Based Differential Privacy

Cited: 0
Authors
He, Hua [1 ,2 ,3 ]
He, Zheng [4 ]
Affiliations
[1] Chongqing Technol & Business Univ, Sch Business Adm, Chongqing 400067, Peoples R China
[2] Geely Univ China, Sch Business, Chengdu 610000, Peoples R China
[3] Krirk Univ, Int Coll, Bangkok 10220, Thailand
[4] Southwest Jiaotong Univ, Sch Informat Sci & Technol, Chengdu 610031, Peoples R China
Keywords
Servers; Data privacy; Data models; Training; Privacy; Gaussian noise; Solid modeling; Differential privacy; Federated learning; Mutual information; privacy-utility trade-off; TRADEOFFS;
DOI
10.1109/ACCESS.2023.3323020
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Federated learning (FL), which not only protects data security and privacy but also enables model training on distributed devices, has received much attention in the literature. Traditionally, stochastic gradient descent (SGD) is used in FL owing to its excellent empirical performance, but users' private information can still be leaked by analyzing the weight updates exchanged during FL iterations. Differential privacy (DP) is an effective way to mitigate this privacy leakage: it adds noise to the users' data gradients, and this artificial noise helps to prevent information leakage. However, SGD-based FL with DP has not yet been investigated through a comprehensive theoretical analysis that considers privacy and data utility jointly, especially from an information-theoretic perspective. In this paper, we investigate FL in the presence of mutual information based DP (MI-DP). Specifically, first, the Gaussian DP mechanism is applied to either the clients or the central server of the FL model, and the privacy and utility of the FL model are characterized by conditional mutual information and distortion, respectively. For a given privacy budget, we establish lower bounds on the variance of the Gaussian noise added to the clients or the central server of the FL model, and show that the utility of the global model remains the same in both cases. Next, we study the privacy-utility trade-off problem in a more general setting, where both the model parameters and the privacy requirements of the clients are flexible. A privacy-preserving scheme is proposed that maximizes the utility of the global model while satisfying the different privacy requirements of all clients. Finally, the results of this paper are further explained through experiments.
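The client-side Gaussian mechanism described in the abstract can be sketched as follows: each client clips its gradient to bound sensitivity and then adds zero-mean Gaussian noise whose variance sets the privacy-utility trade-off. This is a minimal illustrative sketch, not the paper's actual algorithm; the function name, the clipping step, and all parameter names are assumptions introduced here for illustration.

```python
import numpy as np

def gaussian_dp_update(gradient, clip_norm=1.0, sigma=1.0, rng=None):
    """Illustrative Gaussian DP mechanism for one client's gradient.

    Clips the gradient to L2 norm `clip_norm` (bounding per-client
    sensitivity), then adds i.i.d. Gaussian noise with standard
    deviation `sigma`. Larger sigma means stronger privacy but lower
    utility of the aggregated global model.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    g = np.asarray(gradient, dtype=float)
    norm = np.linalg.norm(g)
    # Scale down only if the gradient exceeds the clipping norm.
    g = g / max(1.0, norm / clip_norm)
    # Additive Gaussian noise with variance sigma**2 per coordinate.
    return g + rng.normal(0.0, sigma, size=g.shape)
```

In an FL round, the server would average these noised updates from all clients; the paper's contribution is lower-bounding the admissible `sigma**2` for a given mutual-information privacy budget, which this sketch does not attempt.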
Pages: 111212 - 111225
Number of pages: 14
Related Papers
50 records
  • [1] A Differential Privacy Federated Learning Scheme Based on Adaptive Gaussian Noise
    Jiao, Sanxiu
    Cai, Lecai
    Wang, Xinjie
    Cheng, Kui
    Gao, Xiang
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2024, 138 (02): : 1679 - 1694
  • [2] A Differential Privacy Strategy Based on Local Features of Non-Gaussian Noise in Federated Learning
    Wang, Xinyi
    Wang, Jincheng
    Ma, Xue
    Wen, Chenglin
    SENSORS, 2022, 22 (07)
  • [3] Federated Learning Differential Privacy Preservation Method Based on Differentiated Noise Addition
    Han, Liquan
    Fan, Di
    Liu, Jinyuan
    Du, Wei
    2023 8TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYTICS, ICCCBDA, 2023, : 285 - 289
  • [4] Byzantine-Robust Federated Learning with Variance Reduction and Differential Privacy
    Zhang, Zikai
    Hu, Rui
    2023 IEEE CONFERENCE ON COMMUNICATIONS AND NETWORK SECURITY, CNS, 2023,
  • [5] A Differential Privacy Federated Learning Scheme with Improved Noise Perturbation
    Liu, Chang
    He, Xiaowei
    Wang, Bin
    Sun, Xinru
    Luo, Yixuan
    Zeng, Yiji
    Wang, Xinyu
    Wang, Jianhang
    Zhao, Haofei
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT IX, ICIC 2024, 2024, 14870 : 61 - 71
  • [6] Differential Privacy Federated Learning Based on Adaptive Adjustment
    Cheng, Yanjin
    Li, Wenmin
    Qin, Sujuan
    Tu, Tengfei
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (03): : 4777 - 4795
  • [7] A federated learning differential privacy algorithm for non-Gaussian heterogeneous data
    Yang, Xinyu
    Wu, Weisan
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [8] NIDS-FLGDP: Network Intrusion Detection Algorithm Based on Gaussian Differential Privacy Federated Learning
    Du, Jiawei
    Yang, Kai
    JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2024, 33 (03)
  • [9] Privacy-preserving federated learning based on noise addition
    Wu, Xianlin
    Chen, Yuwen
    Yu, Haiyang
    Yang, Zhen
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 267