Differentially private stochastic gradient descent via compression and memorization

Cited by: 9
Authors
Phong, Le Trieu [1 ]
Phuong, Tran Thi [1 ,2 ]
Affiliations
[1] Natl Inst Informat & Commun Technol NICT, Tokyo 1848795, Japan
[2] Meiji Univ, Kawasaki, Kanagawa 2148571, Japan
Keywords
Differential privacy; Neural network; Stochastic gradient descent; Gradient compression and memorization; Logistic regression
DOI
10.1016/j.sysarc.2022.102819
Chinese Library Classification
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
We propose a novel approach for achieving differential privacy in neural network training through compression and memorization of gradients. The compression technique, which makes gradient vectors sparse, reduces the sensitivity so that differential privacy can be achieved with less noise, whereas the memorization technique, which remembers the unused gradient parts, keeps track of the descent direction and thereby maintains the accuracy of the proposed algorithm. Our differentially private algorithm, called dp-memSGD for short, provably converges at the same 1/√T rate as the standard stochastic gradient descent (SGD) algorithm, where T is the number of training iterations. Experimentally, we demonstrate that dp-memSGD converges with reasonable privacy losses on many benchmark datasets.
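The compression-plus-memorization idea described in the abstract can be sketched in a few lines of Python. This is only an illustration of the general pattern (top-k gradient sparsification with an error-feedback memory, norm clipping to bound sensitivity, and Gaussian noise on the retained coordinates); the function name, the choice of top-k as the compressor, and the hyperparameters `k`, `clip`, `sigma`, and `lr` are assumptions for exposition, not the paper's exact dp-memSGD specification.

```python
import numpy as np

def dp_mem_sgd_step(w, grad, memory, k, clip, sigma, lr, rng):
    """One illustrative dp-memSGD-style step (sketch, not the paper's exact
    algorithm): compress the gradient plus memorized residual by keeping the
    k largest-magnitude coordinates, clip the sparse update to bound its
    sensitivity, add Gaussian noise for differential privacy, and memorize
    the discarded coordinates so the descent direction is not lost."""
    g = grad + memory                          # fold in remembered gradient parts
    idx = np.argsort(np.abs(g))[-k:]           # indices of the k largest entries
    sparse = np.zeros_like(g)
    sparse[idx] = g[idx]                       # compressed (sparse) gradient
    new_memory = g - sparse                    # remember the unused parts
    # clip the sparse update so its L2 norm is at most `clip` (sensitivity bound)
    norm = np.linalg.norm(sparse)
    sparse = sparse / max(1.0, norm / clip)
    # add Gaussian noise only on the retained coordinates
    noisy = sparse.copy()
    noisy[idx] += rng.normal(0.0, sigma * clip, size=k)
    return w - lr * noisy, new_memory
```

With `sigma = 0` the step reduces to plain memorized top-k SGD, which makes the compression and memorization bookkeeping easy to check by hand.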
Pages: 9