Differential Privacy in Distributed Optimization With Gradient Tracking

Cited by: 6
Authors
Huang, Lingying [1 ]
Wu, Junfeng [2 ]
Shi, Dawei [3 ]
Dey, Subhrakanti [4 ]
Shi, Ling [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Sch Data Sci, Shenzhen 518172, Peoples R China
[3] Beijing Inst Technol, Sch Automat, State Key Lab Intelligent Control & Decis Complex, Beijing 100081, Peoples R China
[4] Uppsala Univ, Dept Elect Engn, SE-75121 Uppsala, Sweden
Funding
National Natural Science Foundation of China
Keywords
Differential privacy (DP); directed graph; distributed optimization; gradient tracking; algorithms
DOI
10.1109/TAC.2024.3352328
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Among distributed optimization algorithms, gradient tracking is particularly notable for its superior convergence results, especially over directed graphs. However, privacy concerns arise when gradient information is transmitted directly, since this induces additional information leakage. Surprisingly, the literature has not adequately addressed the associated privacy issues. In response to this gap, this article proposes a privacy-preserving distributed optimization algorithm with gradient tracking that adds noise to the transmitted messages, namely the decision variables and the estimate of the aggregated gradient. We prove two dilemmas for this class of algorithms. First, we reveal that such a distributed optimization algorithm with gradient tracking cannot achieve ε-differential privacy (DP) and exact convergence simultaneously. Building on this, we show that the algorithm fails to achieve ε-DP when nonsummable stepsizes are employed in the presence of Laplace noise. Crucially, these findings hold regardless of the size of the privacy metric ε. We then rigorously analyze the convergence performance and privacy level under summable stepsize sequences with Laplace noise, since only summable stepsizes are meaningful to study in this setting. We derive sufficient conditions under which stochastically bounded accuracy and ε-DP are achieved simultaneously. Recognizing that several choices can satisfy these conditions, we further derive an upper bound on the variance of the mean error and give an explicit expression for ε under these conditions. Numerical simulations demonstrate the effectiveness of the proposed algorithm.
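To make the message-perturbation idea concrete, the following is a minimal Python sketch of a gradient-tracking iteration in which each agent adds Laplace noise to the two messages it transmits (its decision variable and its aggregated-gradient estimate) before they are mixed over the network. The function name noisy_gradient_tracking, the row-stochastic mixing matrix W, and the stepsize and noise-scale sequences alphas and noise_scales are illustrative assumptions for this sketch; it is not the paper's exact algorithm or notation.

```python
import numpy as np

def noisy_gradient_tracking(grads, x0, W, alphas, noise_scales, rng=None):
    """Illustrative sketch: gradient tracking with Laplace-perturbed messages.

    grads        -- list of callables; grads[i](x) returns agent i's local gradient
    x0           -- (n, d) array of initial decision variables, one row per agent
    W            -- (n, n) row-stochastic mixing matrix of the communication graph
    alphas       -- sequence of stepsizes alpha_k (summable, per the regime above)
    noise_scales -- sequence of Laplace scales b_k for the transmitted messages
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = x0.shape
    x = np.asarray(x0, dtype=float).copy()
    y = np.stack([grads[i](x[i]) for i in range(n)])  # tracks the average gradient
    g_old = y.copy()

    for alpha, b in zip(alphas, noise_scales):
        # Each agent perturbs both messages it transmits: its decision variable
        # and its estimate of the aggregated gradient.
        x_sent = x + rng.laplace(scale=b, size=x.shape)
        y_sent = y + rng.laplace(scale=b, size=y.shape)

        x_next = W @ x_sent - alpha * y               # mix noisy states, descend on local tracker
        g_new = np.stack([grads[i](x_next[i]) for i in range(n)])
        y = W @ y_sent + g_new - g_old                # gradient-tracking update on noisy messages
        x, g_old = x_next, g_new
    return x
```

As an illustrative usage, one might take quadratic local objectives f_i(x) = 0.5*||x - c_i||^2 (so grads[i] is x - c_i), a row-stochastic W, geometrically decaying stepsizes such as alphas = [0.5 * 0.95**k for k in range(200)], and noise scales decaying at a compatible rate; summable stepsizes of this kind correspond to the regime analyzed in the abstract, where accuracy is stochastically bounded rather than exact.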
Pages: 5727-5742
Page count: 16