Privacy Auditing in Differential Private Machine Learning: The Current Trends

Cited: 0
Authors
Namatevs, Ivars [1 ]
Sudars, Kaspars [1 ]
Nikulins, Arturs [1 ]
Ozols, Kaspars [1 ]
Affiliations
[1] Inst Elect & Comp Sci, 14 Dzerbenes St, LV-1006 Riga, Latvia
Source
APPLIED SCIENCES-BASEL | 2025, Vol. 15, Issue 02
Keywords
differential privacy; differential private machine learning; differential privacy auditing; privacy attacks; MEMBERSHIP INFERENCE ATTACKS; INFORMATION LEAKAGE; REGRESSION-MODELS; INVERSION; INTERVALS; MECHANISM; SECURITY; BOX;
DOI
10.3390/app15020647
Chinese Library Classification
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Differential privacy has recently gained prominence, especially in the context of private machine learning. While the definition of differential privacy makes it possible to provably limit the amount of information leaked by an algorithm, practical implementations of differentially private algorithms often contain subtle vulnerabilities. Therefore, there is a need for effective methods that can audit (ε, δ)-differentially private algorithms before they are deployed in the real world. The article examines studies that recommend privacy guarantees for differential private machine learning. It covers a wide range of topics on the subject and provides comprehensive guidance for privacy auditing schemes based on privacy attacks to protect machine-learning models from privacy leakage. Our results contribute to the growing literature on differential privacy in the realm of privacy auditing and beyond and pave the way for future research in the field of privacy-preserving models.
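As background for the attack-based auditing schemes summarized in the abstract, the sketch below illustrates one standard way such audits are scored: under (ε, δ)-differential privacy, a membership-inference attacker's true-positive rate and false-positive rate satisfy TPR ≤ e^ε · FPR + δ, so observed attack performance yields an empirical lower bound on ε that can be compared against the claimed guarantee. This is a minimal illustration of that bound, not the article's method; the function name and example numbers are assumptions introduced here.

```python
# Illustrative sketch (not the surveyed article's method): turn observed
# membership-inference attack rates into an empirical epsilon lower bound,
# using the (epsilon, delta)-DP constraint  TPR <= e^eps * FPR + delta
# and its symmetric counterpart on the complementary rates.
import math

def empirical_epsilon_lower_bound(tpr: float, fpr: float, delta: float = 1e-5) -> float:
    """Point-estimate lower bound on epsilon implied by observed attack rates.

    tpr:   attacker's true-positive rate (flags records that were in training)
    fpr:   attacker's false-positive rate (flags records that were not)
    delta: the delta of the (epsilon, delta)-DP guarantee being audited
    """
    candidates = []
    if fpr > 0 and tpr - delta > 0:
        # From TPR <= e^eps * FPR + delta
        candidates.append(math.log((tpr - delta) / fpr))
    if tpr < 1 and (1 - fpr) - delta > 0:
        # Symmetric bound obtained by swapping the two hypotheses
        candidates.append(math.log(((1 - fpr) - delta) / (1 - tpr)))
    return max(candidates, default=0.0)

# Example: an attack with TPR = 0.60 at FPR = 0.05 certifies eps >= ~2.48,
# so a mechanism claiming (eps = 1.0, delta = 1e-5)-DP would fail this audit.
print(empirical_epsilon_lower_bound(0.60, 0.05))
```

A full audit additionally accounts for sampling error in the estimated TPR and FPR (for example via confidence intervals over many attack trials) before declaring a violation; the point estimate above is only the core arithmetic.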
Pages: 54
Related Papers
50 records in total
  • [41] The Limits of Differential Privacy (and Its Misuse in Data Release and Machine Learning)
    Domingo-Ferrer, Josep
    Sanchez, David
    Blanco-Justicia, Alberto
    COMMUNICATIONS OF THE ACM, 2021, 64 (07) : 33 - 35
  • [42] Helmholtz machine with differential privacy
    Hu, Junying
    Sun, Kai
    Zhang, Hai
    INFORMATION SCIENCES, 2022, 613 : 888 - 903
  • [43] Machine Learning Differential Privacy With Multifunctional Aggregation in a Fog Computing Architecture
    Yang, Mengmeng
    Zhu, Tianqing
    Liu, Bo
    Xiang, Yang
    Zhou, Wanlei
    IEEE ACCESS, 2018, 6 : 17119 - 17129
  • [45] Current Trends and Applications of Machine Learning in Tribology-A Review
    Marian, Max
    Tremmel, Stephan
    LUBRICANTS, 2021, 9 (09)
  • [46] Machine Learning in Malware Analysis: Current Trends and Future Directions
    Altaha, Safa
    Riad, Khaled
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (01) : 1267 - 1279
  • [47] LDP-FL: Practical Private Aggregation in Federated Learning with Local Differential Privacy
    Sun, Lichao
    Qian, Jianwei
    Chen, Xun
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 1571 - 1578
  • [48] Effects of Noise on Machine Learning Algorithms Using Local Differential Privacy Techniques
    Gadepally, Krishna Chaitanya
    Mangalampalli, Sameer
    2021 IEEE INTERNATIONAL IOT, ELECTRONICS AND MECHATRONICS CONFERENCE (IEMTRONICS), 2021, : 91 - 94
  • [49] Differential privacy in deep learning: Privacy and beyond
    Wang, Yanling
    Wang, Qian
    Zhao, Lingchen
    Wang, Cong
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 148 : 408 - 424
  • [50] Signal Processing and Machine Learning with Differential Privacy [Algorithms and challenges for continuous data]
    Sarwate, Anand D.
    Chaudhuri, Kamalika
    IEEE SIGNAL PROCESSING MAGAZINE, 2013, 30 (05) : 86 - 94