Joint Optimal Quantization and Aggregation of Federated Learning Scheme in VANETs

Cited: 19
Authors
Li, Yifei [1 ]
Guo, Yijia [2 ]
Alazab, Mamoun [3 ]
Chen, Shengbo [1 ]
Shen, Cong [4 ]
Yu, Keping [5 ]
Affiliations
[1] Henan Univ, Sch Comp & Informat Engn, Kaifeng 475001, Peoples R China
[2] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100190, Peoples R China
[3] Charles Darwin Univ, Coll Engn IT & Environm, Casuarina, NT 0810, Australia
[4] Univ Virginia, Charles L Brown Dept Elect & Comp Engn, Charlottesville, VA 22904 USA
[5] Waseda Univ, Global Informat & Telecommun Inst, Shinjuku Ku, Tokyo 1698050, Japan
Funding
National Natural Science Foundation of China; Japan Society for the Promotion of Science;
Keywords
Quantization (signal); Servers; Collaborative work; Optimization; Data models; Computational modeling; Standards; Artificial intelligence; vehicular ad hoc networks; federated learning; quantization; VEHICLES;
DOI
10.1109/TITS.2022.3145823
Chinese Library Classification (CLC)
TU [Building Science];
Discipline Code
0813;
Abstract
Vehicular ad hoc networks (VANETs) are among the most promising approaches to Intelligent Transportation Systems (ITS). With the rapid growth in the amount of traffic data, deep-learning-based algorithms have been used extensively in VANETs. The recently proposed federated learning paradigm is an attractive candidate for collaborative machine learning: instead of transferring a plethora of data to a centralized server, all clients train their respective local models and upload them to the server for model aggregation. Model quantization is an effective approach to the communication-efficiency issue in federated learning, yet existing studies largely assume homogeneous quantization across all clients. In reality, however, clients are predominantly heterogeneous and support different quantization precision levels. In this work, we propose FedDO (Federated Learning with Double Optimization). Minimizing the drift term in the convergence analysis, which is a weighted sum of squared quantization errors (SQE) over all clients, leads to a double optimization at both the client and server sides. In particular, each client adopts a fully distributed, instantaneous (per learning round) and individualized (per client) quantization scheme that minimizes its own squared quantization error, and the server computes the aggregation weights that minimize the weighted sum of squared quantization errors over all clients. We show via numerical experiments that the minimal-SQE quantizer outperforms a widely adopted linear quantizer for federated learning. We also demonstrate the performance advantages of FedDO over vanilla FedAvg with standard equal weights and linear quantization.
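The double optimization described in the abstract can be illustrated with a small sketch. This is not the authors' implementation; it assumes a uniform quantizer, a simple grid search over symmetric clipping ranges as the per-client minimal-SQE step, and the closed-form weights w_i ∝ 1/SQE_i that minimize Σ w_i² · SQE_i subject to Σ w_i = 1 (an illustrative stand-in for the paper's server-side optimization):

```python
import numpy as np

def uniform_quantize(x, bits, lo, hi):
    """Uniformly quantize x to 2**bits levels on [lo, hi]."""
    levels = 2 ** bits - 1
    xc = np.clip(x, lo, hi)
    step = (hi - lo) / levels
    return lo + np.round((xc - lo) / step) * step

def min_sqe_quantize(x, bits, n_grid=50):
    """Per-round, per-client step: search symmetric clipping ranges
    and keep the quantizer with the smallest squared quantization
    error (SQE) for this client's update."""
    best_q, best_err = None, np.inf
    amax = np.abs(x).max()
    for c in np.linspace(amax / n_grid, amax, n_grid):
        q = uniform_quantize(x, bits, -c, c)
        err = np.sum((x - q) ** 2)
        if err < best_err:
            best_q, best_err = q, err
    return best_q, best_err

def aggregation_weights(sqes):
    """Server-side step: weights minimizing sum_i w_i^2 * SQE_i
    subject to sum_i w_i = 1; the Lagrangian gives w_i ∝ 1/SQE_i."""
    inv = 1.0 / np.asarray(sqes)
    return inv / inv.sum()

# Heterogeneous clients: each supports a different bit precision.
rng = np.random.default_rng(0)
updates = [rng.normal(size=1000) for _ in range(4)]
bits = [2, 3, 4, 8]
quant, sqes = zip(*(min_sqe_quantize(u, b) for u, b in zip(updates, bits)))
w = aggregation_weights(sqes)
aggregate = sum(wi * qi for wi, qi in zip(w, quant))
```

Because the grid includes the full symmetric range c = max|x|, the searched quantizer is never worse than the fixed full-range uniform quantizer; the inverse-SQE weights then down-weight coarsely quantized (low-bit) clients in the aggregate.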
Pages: 19852-19863 (12 pages)
Related Papers
50 records in total
  • [1] A Privacy-Preserving Aggregation Scheme With Continuous Authentication for Federated Learning in VANETs
    Feng, Xia
    Wang, Xiaofeng
    Liu, Haiyang
    Yang, Haowei
    Wang, Liangmin
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (07) : 9465 - 9477
  • [2] Global Aggregation Node Selection Scheme in Federated Learning for Vehicular Ad Hoc Networks (VANETs)
    Trabelsi, Zouheir
    Qayyum, Tariq
    Hayawi, Kadhim
    Ali, Muhammad
    2022 IEEE INTERNATIONAL CONFERENCE ON OMNI-LAYER INTELLIGENT SYSTEMS (IEEE COINS 2022), 2022, : 246 - 251
  • [3] Dynamic Aggregation for Heterogeneous Quantization in Federated Learning
    Chen, Shengbo
    Shen, Cong
    Zhang, Lanxue
    Tang, Yuanmin
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2021, 20 (10) : 6804 - 6819
  • [4] Joint Privacy Enhancement and Quantization in Federated Learning
    Lang, Natalie
    Sofer, Elad
    Shaked, Tomer
    Shlezinger, Nir
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 295 - 310
  • [5] HeteroSAg: Secure Aggregation With Heterogeneous Quantization in Federated Learning
    Elkordy, Ahmed Roushdy
    Avestimehr, A. Salman
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2022, 70 (04) : 2372 - 2386
  • [6] Batch-Aggregate: Efficient Aggregation for Private Federated Learning in VANETs
    Feng, Xia
    Liu, Haiyang
    Yang, Haowei
    Xie, Qingqing
    Wang, Liangmin
    IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2024, 21 (05) : 4939 - 4952
  • [7] Verifiable and Secure Aggregation Scheme for Federated Learning
    Ren Y.
    Fu Y.
    Li Y.
    JOURNAL OF BEIJING UNIVERSITY OF POSTS AND TELECOMMUNICATIONS (Beijing Youdian Daxue Xuebao), 2023, 46 (03): 49 - 55
  • [8] A federated learning scheme for hierarchical protection and multiple aggregation
    Wang, Zhiqiang
    Yu, Xinyue
    Wang, Haoyu
    Xue, Peiyang
    COMPUTERS & ELECTRICAL ENGINEERING, 2024, 117
  • [9] A Secure Aggregation Scheme for Model Update in Federated Learning
    Wang, Baolin
    Hu, Chunqiang
    Liu, Zewei
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS (WASA 2022), PT I, 2022, 13471 : 500 - 512
  • [10] Federated Learning With Heterogeneous Quantization Bit Allocation and Aggregation for Internet of Things
    Chen, Shengbo
    Li, Le
    Wang, Guanghui
    Pang, Meng
    Shen, Cong
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (02) : 3132 - 3143