Federated Learning via Meta-Variational Dropout

Cited by: 0
Authors
Jeon, Insu [1 ]
Hong, Minui [1 ]
Yun, Junhyeog [1 ]
Kim, Gunhee [1 ]
Affiliations
[1] Seoul Natl Univ, Seoul, South Korea
Funding
National Research Foundation of Singapore
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated Learning (FL) aims to train a global inference model from remotely distributed clients and has gained popularity for its data-privacy benefits. However, traditional FL often faces practical challenges, including model overfitting and divergent local models caused by the limited, non-IID data available at each client. To address these issues, we introduce a novel Bayesian meta-learning approach called meta-variational dropout (MetaVD). MetaVD learns to predict client-dependent dropout rates via a shared hypernetwork, enabling effective model personalization of FL algorithms in limited non-IID data settings. We also emphasize the posterior adaptation view of meta-learning and the posterior aggregation view of Bayesian FL via the conditional dropout posterior. We conducted extensive experiments on various sparse and non-IID FL datasets. MetaVD demonstrated excellent classification accuracy and uncertainty calibration performance, especially for out-of-distribution (OOD) clients. MetaVD also compresses the local model parameters needed for each client, mitigating model overfitting and reducing communication costs. Code is available at https://github.com/insujeon/MetaVD.
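The abstract describes the core mechanism: a shared hypernetwork predicts client-dependent dropout rates, which parameterize a variational dropout posterior over the local model. The sketch below illustrates that idea in PyTorch under simplifying assumptions; it is not the authors' released implementation (see the linked repository for that), and the names DropoutHyperNet, VariationalDropoutLinear, client_dim, and the Gaussian-noise parameterization alpha = p / (1 - p) are illustrative choices.

```python
# Minimal, illustrative sketch of hypernetwork-predicted, client-dependent
# variational dropout (assumed names; not the authors' implementation).
import torch
import torch.nn as nn


class DropoutHyperNet(nn.Module):
    """Maps a client embedding to per-unit dropout rates p in (0, 1)."""

    def __init__(self, client_dim: int, out_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(client_dim, 64), nn.ReLU(),
            nn.Linear(64, out_features),
        )

    def forward(self, client_emb: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(client_emb))  # keep rates in (0, 1)


class VariationalDropoutLinear(nn.Module):
    """Linear layer with multiplicative Gaussian dropout noise whose
    per-unit variance alpha = p / (1 - p) comes from the hypernetwork."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        h = self.linear(x)
        if self.training:
            alpha = p / (1.0 - p)              # noise variance per output unit
            eps = torch.randn_like(h)          # reparameterization trick
            h = h * (1.0 + alpha.sqrt() * eps) # multiplicative Gaussian noise
        return h


if __name__ == "__main__":
    # Toy usage: one shared hypernetwork, one client-specific forward pass.
    client_emb = torch.randn(1, 16)            # hypothetical client embedding
    hyper = DropoutHyperNet(client_dim=16, out_features=32)
    layer = VariationalDropoutLinear(in_features=10, out_features=32)

    p = hyper(client_emb)                      # client-dependent dropout rates
    x = torch.randn(8, 10)                     # a batch of local client data
    out = layer(x, p)
    print(out.shape)                           # torch.Size([8, 32])
```

Because only the hypernetwork is shared, each client's personalized posterior is specified by its predicted dropout rates rather than a full copy of local parameters, which is consistent with the compression and communication-cost claims in the abstract.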
Pages: 26
Related Papers (50 records in total)
  • [1] Meta-variational quantum Monte Carlo
    Zhao, Tianchen
    Stokes, James
    Veerapaneni, Shravan
    QUANTUM MACHINE INTELLIGENCE, 2023, 5 (01)
  • [2] Federated Learning for Indoor Localization via Model Reliability With Dropout
    Park, Junha
    Moon, Jiseon
    Kim, Taekyoon
    Wu, Peng
    Imbiriba, Tales
    Closas, Pau
    Kim, Sunwoo
    IEEE COMMUNICATIONS LETTERS, 2022, 26 (07) : 1553 - 1557
  • [3] Meta-Variational Quantum Eigensolver: Learning Energy Profiles of Parameterized Hamiltonians for Quantum Simulation
    Cervera-Lierta, Alba
    Kottmann, Jakob S.
    Aspuru-Guzik, Alan
    PRX QUANTUM, 2021, 2 (02):
  • [4] Personalized Federated Learning via Variational Bayesian Inference
    Zhang, Xu
    Li, Yinchuan
    Li, Wenpeng
    Guo, Kaiyang
    Shao, Yunfeng
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [5] A Collaborative Learning Framework via Federated Meta-Learning
    Lin, Sen
    Yang, Guang
    Zhang, Junshan
    2020 IEEE 40TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS), 2020, : 289 - 299
  • [6] Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning
    Bouacida, Nader
    Hou, Jiahui
    Zang, Hui
    Liu, Xin
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021,
  • [7] Confidence-aware Personalized Federated Learning via Variational Expectation Maximization
    Zhu, Junyi
    Ma, Xingchen
    Blaschko, Matthew B.
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 24542 - 24551
  • [8] Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
    Kassab, Rahif
    Simeone, Osvaldo
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 2180 - 2192
  • [9] Solving Client Dropout in Federated Learning via Client Similarity Discovery and Gradient Supplementation Mechanism
    Yan, Maoxuan
    Luo, Qingcai
    Zhang, Bo
    Sun, Shanbao
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2023, PT V, 2024, 14491 : 446 - 457