Minimum Gaussian Noise Variance of Federated Learning in the Presence of Mutual Information Based Differential Privacy

Cited by: 0
Authors
He, Hua [1 ,2 ,3 ]
He, Zheng [4 ]
Affiliations
[1] Chongqing Technol & Business Univ, Sch Business Adm, Chongqing 400067, Peoples R China
[2] Geely Univ China, Sch Business, Chengdu 610000, Peoples R China
[3] Krirk Univ, Int Coll, Bangkok 10220, Thailand
[4] Southwest Jiaotong Univ, Sch Informat Sci & Technol, Chengdu 610031, Peoples R China
Keywords
Servers; Data privacy; Data models; Training; Privacy; Gaussian noise; Solid modeling; Differential privacy; Federated learning; Mutual information; privacy-utility trade-off; TRADEOFFS
DOI
10.1109/ACCESS.2023.3323020
Chinese Library Classification
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Federated learning (FL), which not only protects data security and privacy but also enables model training on distributed devices, has received much attention in the literature. Traditionally, stochastic gradient descent (SGD) is used in FL because of its excellent empirical performance, but users' private information can still be leaked by analyzing the weight updates exchanged during FL iterations. Differential privacy (DP) is an effective way to mitigate this leakage: artificial noise is added to the users' gradients, which helps prevent information leakage. However, SGD-based FL with DP has not yet been given a comprehensive theoretical analysis that considers privacy and data utility jointly, especially from an information-theoretic perspective. In this paper, we investigate FL in the presence of mutual information based DP (MI-DP). Specifically, first, the Gaussian DP mechanism is applied either to the clients or to the central server of the FL model, and the privacy and utility of the FL model are characterized by conditional mutual information and distortion, respectively. For a given privacy budget, we establish lower bounds on the variance of the Gaussian noise added to the clients or to the central server, and show that the utility of the global model remains the same in both cases. Next, we study the privacy-utility trade-off in a more general setting, where both the model parameters and the privacy requirements of the clients are flexible. We propose a privacy-preserving scheme that maximizes the utility of the global model while satisfying the different privacy requirements of all clients. Finally, the results of the paper are further illustrated by experimental results.
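The "noise at the clients" case described in the abstract can be illustrated with a minimal sketch of the Gaussian DP mechanism in an FL round: each client clips its update to a bounded L2 norm and then adds per-coordinate Gaussian noise of variance sigma^2 before the server averages. This is a generic illustration only; the function names (`gaussian_mechanism`, `federated_round`), the clipping step, and the specific parameter values are assumptions for the sketch, and the paper's actual lower bound relating sigma to the mutual-information privacy budget is not implemented here.

```python
import math
import random

def gaussian_mechanism(update, clip_norm, sigma, rng):
    """Clip `update` to L2 norm `clip_norm`, then add N(0, sigma^2)
    noise to each coordinate (the standard Gaussian mechanism)."""
    norm = math.sqrt(sum(u * u for u in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    clipped = [u * scale for u in update]
    return [u + rng.gauss(0.0, sigma) for u in clipped]

def federated_round(client_updates, clip_norm, sigma, seed=0):
    """One FL round with client-side noising: each client perturbs its
    own update locally, then the server averages the noisy updates."""
    rng = random.Random(seed)
    noisy = [gaussian_mechanism(u, clip_norm, sigma, rng)
             for u in client_updates]
    n = len(noisy)
    return [sum(col) / n for col in zip(*noisy)]

# Three hypothetical clients, 2-dimensional updates.
updates = [[0.5, -1.2], [0.8, 0.3], [-0.1, 0.9]]
avg = federated_round(updates, clip_norm=1.0, sigma=0.1)
```

The "noise at the server" case the abstract contrasts with would instead average the clipped updates first and add a single noise draw at the server; the paper's result is that, for matched privacy budgets, both placements yield the same global-model utility.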
Pages: 111212 - 111225
Number of pages: 14
Related papers
50 records total
  • [21] Sybil Attacks and Defense on Differential Privacy based Federated Learning
    Jiang, Yupeng
    Li, Yong
    Zhou, Yipeng
    Zheng, Xi
    2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 355 - 362
  • [22] Federated Learning with Privacy-Preserving Active Learning: A Min-Max Mutual Information Approach
    Alsulaimawi, Zahir (alsulaiz@oregonstate.edu), 1600, Institute of Electrical and Electronics Engineers Inc.
  • [23] A secure and privacy preserved infrastructure for VANETs based on federated learning with local differential privacy
    Batool, Hajira
    Anjum, Adeel
    Khan, Abid
    Izzo, Stefano
    Mazzocca, Carlo
    Jeon, Gwanggil
    INFORMATION SCIENCES, 2024, 652
  • [24] Privacy-Preserving Federated Learning based on Differential Privacy and Momentum Gradient Descent
    Weng, Shangyin
    Zhang, Lei
    Feng, Daquan
    Feng, Chenyuan
    Wang, Ruiyu
    Klaine, Paulo Valente
    Imran, Muhammad Ali
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [26] Evaluating Differential Privacy in Federated Continual Learning
    Ouyang, Junyan
    Han, Rui
    Liu, Chi Harold
    2023 IEEE 98TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2023-FALL, 2023,
  • [27] Vertically Federated Learning with Correlated Differential Privacy
    Zhao, Jianzhe
    Wang, Jiayi
    Li, Zhaocheng
    Yuan, Weiting
    Matwin, Stan
    ELECTRONICS, 2022, 11 (23)
  • [28] Decentralized Wireless Federated Learning With Differential Privacy
    Chen, Shuzhen
    Yu, Dongxiao
    Zou, Yifei
    Yu, Jiguo
    Cheng, Xiuzhen
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (09) : 6273 - 6282
  • [29] Enhancing Differential Privacy for Federated Learning at Scale
    Baek, Chunghun
    Kim, Sungwook
    Nam, Dongkyun
    Park, Jihoon
    IEEE ACCESS, 2021, 9 : 148090 - 148103
  • [30] Differential Privacy Federated Learning: A Comprehensive Review
    Shan, Fangfang
    Mao, Shiqi
    Lu, Yanlong
    Li, Shuaifeng
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (07) : 220 - 230