Differential Privacy in Distributed Optimization With Gradient Tracking

Cited: 6
Authors
Huang, Lingying [1]
Wu, Junfeng [2]
Shi, Dawei [3]
Dey, Subhrakanti [4]
Shi, Ling [1]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Sch Data Sci, Shenzhen 518172, Peoples R China
[3] Beijing Inst Technol, Sch Automat, State Key Lab Intelligent Control & Decis Complex, Beijing 100081, Peoples R China
[4] Uppsala Univ, Dept Elect Engn, SE-75121 Uppsala, Sweden
Funding
National Natural Science Foundation of China
Keywords
Differential privacy (DP); directed graph; distributed optimization; gradient tracking; algorithms
DOI
10.1109/TAC.2024.3352328
Chinese Library Classification
TP [Automation Technology & Computer Technology]
Subject Classification Code
0812
Abstract
Among distributed optimization algorithms, those with gradient tracking stand out for their superior convergence guarantees, particularly over directed graphs. However, transmitting gradient information directly raises privacy concerns, since it leaks additional information, and the literature has not adequately addressed these privacy issues. In response to this gap, this article proposes a privacy-preserving distributed optimization algorithm with gradient tracking that adds noise to the transmitted messages, namely, the decision variables and the estimate of the aggregated gradient. We prove two dilemmas for this class of algorithms. The first dilemma shows that a distributed optimization algorithm with gradient tracking cannot achieve ε-differential privacy (DP) and exact convergence simultaneously. Building on this, we then show that the algorithm fails to achieve ε-DP when nonsummable stepsizes are employed in the presence of Laplace noise. Crucially, these findings hold regardless of the size of the privacy parameter ε. We then rigorously analyze the convergence performance and privacy level for summable stepsize sequences under the Laplace distribution, since summable stepsizes are the only case that remains meaningful to study. We derive sufficient conditions under which stochastically bounded accuracy and ε-DP hold simultaneously. Since several choices satisfy these conditions, we further derive an upper bound on the variance of the mean error and give an explicit expression for ε under such conditions. Numerical simulations demonstrate the effectiveness of the proposed algorithm.
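As a rough illustration of the message-perturbation mechanism described in the abstract, the Python sketch below runs a simplified gradient-tracking loop in which each agent adds Laplace noise to both transmitted messages (the decision variable and the gradient-tracking variable) and uses a geometrically decaying, hence summable, stepsize. The ring topology, the doubly stochastic weights W, the scalar quadratic costs, and the constants gamma0, q, and noise_scale are illustrative assumptions; the paper's actual algorithm operates over directed graphs and derives precise conditions on the stepsize and noise sequences.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem: n agents, each with a private scalar quadratic cost
# f_i(x) = 0.5 * a[i] * x**2 + b[i] * x; the network minimizes sum_i f_i.
n, T = 5, 200
a = rng.uniform(1.0, 2.0, n)
b = rng.uniform(-1.0, 1.0, n)

def grad(i, x):
    return a[i] * x + b[i]

x_star = -b.sum() / a.sum()  # minimizer of the aggregate cost

# Doubly stochastic ring weights (an undirected simplification; the paper
# treats directed graphs with appropriate stochastic weight matrices).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros(n)
y = np.array([grad(i, x[i]) for i in range(n)])  # tracks the average gradient

gamma0, q = 0.5, 0.97   # summable geometric stepsizes: gamma_k = gamma0 * q**k
noise_scale = 0.1       # Laplace scale for transmitted messages (assumed value)

for k in range(T):
    gamma = gamma0 * q**k
    # Each agent broadcasts Laplace-perturbed copies of its two messages
    # (for simplicity, the vectorized form perturbs the self-loop term too).
    x_msg = x + rng.laplace(0.0, noise_scale, n)
    y_msg = y + rng.laplace(0.0, noise_scale, n)
    x_new = W @ x_msg - gamma * y            # consensus + tracked-gradient step
    g_old = np.array([grad(i, x[i]) for i in range(n)])
    g_new = np.array([grad(i, x_new[i]) for i in range(n)])
    y = W @ y_msg + g_new - g_old            # gradient-tracking update
    x = x_new

print("max deviation from optimum:", np.abs(x - x_star).max())
```

Consistent with the two dilemmas above, a nonsummable stepsize (e.g., gamma_k = 1/k) would forfeit ε-DP under Laplace noise, while the summable choice here trades exact convergence for stochastically bounded accuracy: the printed deviation settles near, but not exactly at, zero.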
Pages: 5727-5742
Page count: 16