Revisiting Model-Agnostic Private Learning: Faster Rates and Active Learning

Cited by: 0
Authors
Liu, Chong [1 ]
Zhu, Yuqing [1 ]
Chaudhuri, Kamalika [2 ]
Wang, Yu-Xiang [1 ]
Affiliations
[1] UC Santa Barbara, Santa Barbara, CA 93106 USA
[2] UC San Diego, La Jolla, CA USA
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The Private Aggregation of Teacher Ensembles (PATE) framework is one of the most promising recent approaches in differentially private learning. Existing theoretical analysis shows that PATE consistently learns any VC class in the realizable setting, but falls short of explaining its success in the more general case where the error rate of the optimal classifier is bounded away from zero. We fill this gap by introducing the Tsybakov Noise Condition (TNC) and establishing stronger and more interpretable learning bounds. These bounds provide new insights into when PATE works and improve over existing results even in the narrower realizable setting. We also investigate the compelling idea of using active learning to save privacy budget. The novel components in the proofs include a more refined analysis of the majority-voting classifier, which may be of independent interest, and the observation that the synthetic "student" learning problem is nearly realizable by construction under the Tsybakov Noise Condition.
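As background for readers unfamiliar with PATE: its aggregation step releases, for each unlabeled query point, the label with the highest noisy vote count among the teacher models. The sketch below is a minimal illustration of that noisy-argmax step, assuming Gaussian noise as in the GNMax variant of PATE (the original mechanism used Laplace noise); the function name and parameter values here are illustrative choices of ours, not taken from the paper.

    import numpy as np

    def noisy_majority_vote(teacher_labels, num_classes, sigma, rng=None):
        """Release a differentially private label via noisy argmax.

        teacher_labels: per-teacher predicted labels for one query point.
        sigma: std. dev. of the Gaussian noise added to each vote count;
               larger sigma gives stronger privacy but a noisier label.
        """
        rng = np.random.default_rng() if rng is None else rng
        # Tally votes per class, then perturb each count independently.
        counts = np.bincount(teacher_labels, minlength=num_classes).astype(float)
        noisy_counts = counts + rng.normal(scale=sigma, size=num_classes)
        return int(np.argmax(noisy_counts))

    # Example: 250 teachers, 10 classes, one "student" query (synthetic votes).
    rng = np.random.default_rng(0)
    votes = rng.integers(0, 10, size=250)
    label = noisy_majority_vote(votes, num_classes=10, sigma=40.0, rng=rng)

When the teachers agree by a wide margin, the noise rarely flips the argmax, which is the intuition behind the abstract's claim that the student's synthetic labeling problem is nearly realizable.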
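For reference, one standard formulation of the Tsybakov Noise Condition (the margin form) bounds the probability mass near the decision boundary; the rendering below is the common textbook statement, not a quotation of this paper's exact definition.

    % Tsybakov Noise Condition (margin form): for the regression function
    % \eta(x) = \Pr(Y = 1 \mid X = x), there exist constants C > 0 and
    % \alpha \ge 0 such that
    \[
      \Pr_{X}\!\left( \left| \eta(X) - \tfrac{1}{2} \right| \le t \right)
      \;\le\; C\, t^{\alpha}
      \qquad \text{for all } t \ge 0.
    \]
    % Larger \alpha means less mass near the boundary; as \alpha \to \infty
    % the problem approaches the hard-margin (realizable) regime.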
Pages: 10