Privacy for Free: Posterior Sampling and Stochastic Gradient Monte Carlo

Citations: 0
Authors
Wang, Yu-Xiang [1 ]
Fienberg, Stephen E. [1 ,2 ]
Smola, Alexander J. [1 ,3 ]
Affiliations
[1] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
[2] Carnegie Mellon Univ, Dept Stat, Pittsburgh, PA 15213 USA
[3] Marianas Labs Inc, Pittsburgh, PA 15213 USA
Funding
National Research Foundation, Singapore;
Keywords
COMPLEXITY; LANGEVIN;
DOI
N/A
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the problem of Bayesian learning on sensitive datasets and present two simple but somewhat surprising results that connect Bayesian learning to "differential privacy", a cryptographic approach to protecting individual-level privacy while permitting database-level utility. Specifically, we show that under standard assumptions, drawing one sample from a posterior distribution is differentially private "for free", and that this sample, as a statistical estimator, is often consistent, near optimal, and computationally tractable. Similarly but separately, we show that a recent line of work that uses stochastic gradients for Hybrid Monte Carlo (HMC) sampling also preserves differential privacy with minor or no modification of the algorithmic procedure. These observations lead to an "anytime" algorithm for Bayesian learning under privacy constraints. We demonstrate that it performs much better than state-of-the-art differentially private methods on synthetic and real datasets.
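To make the second result concrete, the following is a minimal sketch of a stochastic gradient Langevin-type sampler of the kind the abstract refers to: each update adds a scaled stochastic-gradient step plus Gaussian noise whose variance matches the step size, so the injected noise is already part of the algorithm rather than a privacy add-on. The Bayesian linear regression model, data sizes, and step size below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

# Synthetic Bayesian linear regression problem (illustrative).
rng = np.random.default_rng(0)
n, d, sigma = 1000, 3, 0.1
X = rng.normal(size=(n, d))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=sigma, size=n)

def grad_log_post(w, xb, yb):
    """Stochastic gradient of the log posterior: N(0, I) prior
    plus a minibatch likelihood term rescaled to the full dataset."""
    prior = -w
    lik = (n / len(yb)) * xb.T @ (yb - xb @ w) / sigma**2
    return prior + lik

w = np.zeros(d)
eps = 1e-6          # step size
samples = []
for t in range(2000):
    idx = rng.choice(n, size=64, replace=False)
    g = grad_log_post(w, X[idx], y[idx])
    # Langevin update: half-step along the gradient + Gaussian noise
    # with variance equal to the step size.
    w = w + 0.5 * eps * g + rng.normal(scale=np.sqrt(eps), size=d)
    if t >= 1000:                 # discard burn-in
        samples.append(w.copy())

w_hat = np.mean(samples, axis=0)  # posterior-mean estimate from the chain
```

Because the Gaussian perturbation is intrinsic to the update, the paper's observation is that (under boundedness assumptions on the gradients) runs of such a sampler can be shown to satisfy differential privacy with little or no change; the sketch above omits any privacy accounting.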
Pages: 2493-2502
Page count: 10
Related Papers
50 records
  • [21] Stochastic comparisons of stratified sampling techniques for some Monte Carlo estimators
    Goldstein, Larry
    Rinott, Yosef
    Scarsini, Marco
    BERNOULLI, 2011, 17 (02) : 592 - 608
  • [22] Importance Sampling in Stochastic Programming: A Markov Chain Monte Carlo Approach
    Parpas, Panos
    Ustun, Berk
    Webster, Mort
    Quang Kha Tran
    INFORMS JOURNAL ON COMPUTING, 2015, 27 (02) : 358 - 377
  • [23] On Sequential Monte Carlo Sampling of Discretely Observed Stochastic Differential Equations
    Sarkka, Simo
    NSSPW: NONLINEAR STATISTICAL SIGNAL PROCESSING WORKSHOP: CLASSICAL, UNSCENTED AND PARTICLE FILTERING METHODS, 2006, : 21 - 24
  • [24] Global Optimization for Stochastic Programming via Sequential Monte Carlo Sampling
    Ni, Wei
    PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019, : 2070 - 2075
  • [25] Image Registration via Stochastic Gradient Markov Chain Monte Carlo
    Grzech, Daniel
    Kainz, Bernhard
    Glocker, Ben
    Le Folgoc, Loic
    UNCERTAINTY FOR SAFE UTILIZATION OF MACHINE LEARNING IN MEDICAL IMAGING, AND GRAPHS IN BIOMEDICAL IMAGE ANALYSIS, UNSURE 2020, GRAIL 2020, 2020, 12443 : 3 - 12
  • [26] Stochastic Gradient Richardson-Romberg Markov Chain Monte Carlo
    Durmus, Alain
    Simsekli, Umut
    Moulines, Eric
    Badeau, Roland
    Richard, Gael
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [27] Stochastic Gradient Hamiltonian Monte Carlo for non-convex learning
    Chau, Huy N.
    Rasonyi, Miklos
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2022, 149 : 341 - 368
  • [28] Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference
    Li, Zhize
    Zhang, Tianyi
    Cheng, Shuyu
    Zhu, Jun
    Li, Jian
    MACHINE LEARNING, 2019, 108 (8-9) : 1701 - 1727
  • [29] sgmcmc: An R Package for Stochastic Gradient Markov Chain Monte Carlo
    Baker, Jack
    Fearnhead, Paul
    Fox, Emily B.
    Nemeth, Christopher
    JOURNAL OF STATISTICAL SOFTWARE, 2019, 91 (03): : 1 - 27
  • [30] Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction
    Zou, Difan
    Xu, Pan
    Gu, Quanquan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32