A Communication-Efficient, Privacy-Preserving Federated Learning Algorithm Based on Two-Stage Gradient Pruning and Differentiated Differential Privacy

Cited by: 2
Authors
Li, Yong [1 ,2 ,3 ]
Du, Wei [1 ]
Han, Liquan [1 ]
Zhang, Zhenjian [1 ]
Liu, Tongtong [1 ]
Affiliations
[1] Changchun Univ Technol, Sch Comp Sci & Engn, Changchun 130012, Peoples R China
[2] Changchun Univ Technol, AI Res Inst, Changchun 130012, Peoples R China
[3] Jilin Univ, Sch Comp Sci & Technol, Changchun 130012, Peoples R China
Keywords
differentiated differential privacy; federated learning; gradient pruning; privacy preserving;
DOI
10.3390/s23239305
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Codes
070302; 081704;
Abstract
Federated learning still faces several unsolved problems, including security concerns and high communication costs. Differential privacy (DP) provides effective privacy protection by adding noise to parameters under a rigorous privacy definition, but excessive noise can compromise model accuracy. High communication cost is a further challenge: training large-scale federated models is slow and expensive in terms of communication resources, and various model pruning algorithms have been proposed to reduce this overhead. To address both challenges, this paper introduces IsmDP-FL, a communication-efficient, privacy-preserving federated learning (FL) algorithm based on two-stage gradient pruning and differentiated differential privacy. In the first stage, the locally trained model undergoes gradient pruning; differential privacy noise is added to the important parameters selected by the pruning step, non-important parameters are pruned by a certain ratio, and differentiated differential privacy is applied to the remaining parameters in each network layer. In the second stage, gradient pruning is performed again before the update is uploaded to the server for aggregation, and the aggregated result is returned to the clients to complete the federated learning round. Extensive experiments demonstrate that the proposed method achieves high communication efficiency, preserves model privacy, and reduces unnecessary consumption of the privacy budget.
Pages: 21
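The abstract describes a two-stage pipeline: client-side gradient pruning with differentiated DP noise on the retained parameters, followed by a second pruning pass before upload and server-side aggregation. Below is a minimal NumPy sketch of that idea, not the authors' implementation: the function names (stage1_prune_and_perturb, stage2_sparsify, server_aggregate), the Gaussian-mechanism noise calibration, the per-layer privacy-budget split between important and remaining parameters, and all pruning ratios are illustrative assumptions.

# Minimal sketch (assumed details, not the IsmDP-FL reference code) of two-stage
# gradient pruning with differentiated differential privacy and FedAvg aggregation.
import numpy as np

def gaussian_noise(n, clip_norm, epsilon, delta=1e-5, rng=None):
    """Gaussian mechanism: sigma calibrated to the clipping norm and budget (assumed calibration)."""
    rng = rng or np.random.default_rng()
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return rng.normal(0.0, sigma, size=n)

def stage1_prune_and_perturb(layer_grads, keep_ratio=0.3, eps_important=4.0, eps_rest=1.0):
    """Stage 1 (per layer): keep the largest-magnitude gradients as 'important',
    prune part of the rest, and add noise with a differentiated privacy budget."""
    noisy = {}
    for name, g in layer_grads.items():
        clip = 1.0
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))   # gradient clipping
        k = max(1, int(keep_ratio * g.size))
        thresh = np.partition(np.abs(g).ravel(), -k)[-k]
        important = np.abs(g) >= thresh
        out = np.zeros_like(g)
        # important coordinates: larger budget, hence smaller noise (assumed allocation)
        out[important] = g[important] + gaussian_noise(important.sum(), clip, eps_important)
        # a fraction of the remaining coordinates survives pruning and gets the smaller budget
        rest = np.flatnonzero(~important.ravel())
        survivors = rest[: len(rest) // 2]                      # illustrative pruning ratio
        flat = out.ravel()
        flat[survivors] = g.ravel()[survivors] + gaussian_noise(len(survivors), clip, eps_rest)
        noisy[name] = flat.reshape(g.shape)
    return noisy

def stage2_sparsify(update, upload_ratio=0.1):
    """Stage 2: top-k sparsification of the noisy update before uploading to the server."""
    sparse = {}
    for name, g in update.items():
        k = max(1, int(upload_ratio * g.size))
        thresh = np.partition(np.abs(g).ravel(), -k)[-k]
        sparse[name] = np.where(np.abs(g) >= thresh, g, 0.0)
    return sparse

def server_aggregate(client_updates):
    """FedAvg-style mean of the sparse, noisy client updates, returned to the clients."""
    keys = client_updates[0].keys()
    return {k: np.mean([u[k] for u in client_updates], axis=0) for k in keys}

# Toy round with two clients and a single 8x8 gradient tensor each.
rng = np.random.default_rng(0)
clients = [{"layer1": rng.normal(size=(8, 8))} for _ in range(2)]
uploads = [stage2_sparsify(stage1_prune_and_perturb(c)) for c in clients]
global_update = server_aggregate(uploads)

In this toy round, each client clips, prunes, and perturbs its gradients locally, sparsifies the result before upload, and the server simply averages the uploads; the differentiated budgets (eps_important versus eps_rest) are where the scheme would spend less of the privacy budget on parameters that matter less.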