Federated Bayesian Deep Learning: The Application of Statistical Aggregation Methods to Bayesian Models

Cited by: 1
Authors
Fischer, John [1 ]
Orescanin, Marko [1 ]
Loomis, Justin [1 ]
McClure, Patrick [1 ]
Affiliations
[1] Naval Postgrad Sch, Dept Comp Sci, Monterey, CA 93943 USA
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Uncertainty; Bayes methods; Data models; Predictive models; Deep learning; Servers; Measurement uncertainty; Training; Gaussian distribution; Remote sensing; Bayesian deep learning; federated learning; Monte Carlo dropout; uncertainty decomposition; uncertainty quantification; variational inference; FORECAST UNCERTAINTY; INFLATION;
DOI
10.1109/ACCESS.2024.3513253
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Number
0812
Abstract
Federated learning (FL) is an approach to training machine learning models that takes advantage of multiple distributed datasets while maintaining data privacy and reducing the communication costs associated with sharing local datasets. Aggregation strategies have been developed to pool or fuse the weights and biases of distributed deterministic models; however, modern deterministic deep learning (DL) models are often poorly calibrated and unable to communicate a measure of epistemic uncertainty in prediction, which is desirable for remote sensing platforms and safety-critical applications. Conversely, Bayesian DL models are often well calibrated and capable of quantifying and communicating a measure of epistemic uncertainty along with competitive prediction accuracy. Unfortunately, because the weights and biases in Bayesian DL models are defined by a probability distribution, simple application of the aggregation methods associated with FL schemes for deterministic models is either impossible or results in sub-optimal performance. In this work, we use independent and identically distributed (IID) and non-IID partitions of the CIFAR-10 dataset and a fully variational ResNet-20 architecture to analyze six different aggregation strategies for Bayesian DL models. Additionally, we analyze the traditional federated averaging approach applied to an approximate Bayesian Monte Carlo dropout model as a lightweight alternative to more complex variational inference methods in FL. We show that the aggregation strategy is a key hyperparameter in the design of a Bayesian FL system, with downstream effects on accuracy, calibration, uncertainty quantification, training stability, and client compute requirements.
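Illustrative sketch (assumptions noted): the abstract states that naively applying deterministic aggregation rules to distributional weights is either impossible or sub-optimal. The Python sketch below shows two simple ways a server could pool per-weight Gaussian variational posteriors q_k(w) = N(mu_k, sigma_k^2) received from K clients: FedAvg-style averaging of the variational parameters, and precision-weighted (product-of-Gaussians) fusion. The function names, the weighting by local sample counts, and the fusion rule are illustrative assumptions for this sketch, not the six strategies evaluated in the paper.

    # Minimal sketch (not the paper's method): two illustrative server-side rules for
    # aggregating per-weight Gaussian variational posteriors from K clients.
    import numpy as np

    def fedavg_gaussian(mus, sigmas, n_samples):
        """Naive FedAvg: average the variational parameters themselves,
        weighting each client by its local dataset size."""
        w = np.asarray(n_samples, dtype=float)
        w /= w.sum()
        mu = np.tensordot(w, np.stack(mus), axes=1)        # weighted mean of means
        sigma = np.tensordot(w, np.stack(sigmas), axes=1)  # weighted mean of std devs
        return mu, sigma

    def precision_weighted_fusion(mus, sigmas):
        """Product-of-Gaussians fusion: sum precisions across clients, which
        down-weights clients that are more uncertain about a given weight."""
        precisions = np.stack([1.0 / s**2 for s in sigmas])
        fused_precision = precisions.sum(axis=0)
        fused_mu = (precisions * np.stack(mus)).sum(axis=0) / fused_precision
        return fused_mu, np.sqrt(1.0 / fused_precision)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy example: 3 clients, each holding a 4-parameter "layer".
        mus = [rng.normal(size=4) for _ in range(3)]
        sigmas = [np.abs(rng.normal(0.1, 0.02, size=4)) for _ in range(3)]
        print(fedavg_gaussian(mus, sigmas, n_samples=[100, 50, 200]))
        print(precision_weighted_fusion(mus, sigmas))

Either fused pair (mu, sigma) would then define the global variational posterior broadcast back to clients for the next round; the paper's point is that this choice of rule materially affects accuracy, calibration, and uncertainty quantification.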
Pages: 185790-185806
Page count: 17