Federated Learning with Privacy Preservation in Large-Scale Distributed Systems Using Differential Privacy and Homomorphic Encryption

Authors
Chen, Yue [1 ]
Yang, Yufei [1 ]
Liang, Yingwei [1 ]
Zhu, Taipeng [1 ]
Huang, Dehui [2 ]
Affiliations
[1] Information Center, Guangdong Power Grid Co., Ltd., Guangzhou 510699, Guangdong, China
[2] Chaozhou Power Supply Bureau Information Center, Guangdong Power Grid Co., Ltd., Chaozhou 521011, Guangdong, China
Source
Informatica (Slovenia) | 2025, Vol. 49, No. 13
Keywords
Adversarial machine learning; Contrastive learning; Differential privacy
DOI
10.31449/inf.v49i13.7358
Abstract
This study proposes a large-scale distributed privacy-preserving machine learning algorithm based on federated learning. The algorithm allows participants to jointly train high-quality models without sharing their original data, addressing the challenges posed by increasingly stringent data privacy and security regulations. To verify the performance of the federated learning system in a real-world environment, we built a distributed experimental platform consisting of multiple physical servers and evaluated it on several publicly available datasets, including MNIST, Federated EMNIST, and Federated CIFAR10/100. The experimental results show that the federated learning system reaches 97.3% accuracy, slightly below the 98.2% achieved by centralized learning; this is an acceptable trade-off given the advantages of federated learning for protecting data privacy. Moreover, accuracy drops only slightly, to about 96.8%, after malicious clients are introduced, demonstrating the robustness of the federated learning system. Specifically, we adopt differential privacy with a privacy budget of ε = 1.0 and add Gaussian noise to each model update, ensuring that even if a malicious party obtains the updates, no sensitive information about any individual user can be inferred from them. Key experimental conditions include: the communication protocol uses homomorphic encryption, the average communication volume per iteration is 150 MB, and the total communication volume is 30 GB; average client CPU utilization is about 70% and GPU utilization about 80%. These settings ensure efficient use of the system's computing resources and reflect the balance between privacy protection and model performance. © 2025 Slovene Society Informatika. All rights reserved.
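To illustrate the differential-privacy step described in the abstract, the sketch below shows how a client might clip its model update and add Gaussian noise calibrated to ε = 1.0 before uploading it. This is a minimal sketch, not the authors' implementation: the clipping norm, δ, and the function name are assumed for the example (the paper only states ε = 1.0), and the homomorphic encryption of the noisy update before aggregation is omitted here.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, epsilon=1.0, delta=1e-5):
    """Clip a client's model update and apply the Gaussian mechanism.

    Illustrative sketch only: clip_norm and delta are assumed values,
    not parameters reported in the paper (which specifies epsilon = 1.0).
    """
    # Bound the L2 norm of the update so its sensitivity is at most clip_norm.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))

    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    noise = np.random.normal(0.0, sigma, size=clipped.shape)
    return clipped + noise

# Example: privatize a flattened gradient/update vector before upload.
update = np.random.randn(1000) * 0.01
noisy_update = privatize_update(update, clip_norm=1.0, epsilon=1.0)
```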
Pages: 123-142