Joint Optimal Quantization and Aggregation of Federated Learning Scheme in VANETs

Cited by: 19
Authors
Li, Yifei [1 ]
Guo, Yijia [2 ]
Alazab, Mamoun [3 ]
Chen, Shengbo [1 ]
Shen, Cong [4 ]
Yu, Keping [5 ]
Affiliations
[1] Henan Univ, Sch Comp & Informat Engn, Kaifeng 475001, Peoples R China
[2] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100190, Peoples R China
[3] Charles Darwin Univ, Coll Engn IT & Environm, Casuarina, NT 0810, Australia
[4] Univ Virginia, Charles L Brown Dept Elect & Comp Engn, Charlottesville, VA 22904 USA
[5] Waseda Univ, Global Informat & Telecommun Inst, Shinjuku Ku, Tokyo 1698050, Japan
Funding
National Natural Science Foundation of China; Japan Society for the Promotion of Science;
Keywords
Quantization (signal); Servers; Collaborative work; Optimization; Data models; Computational modeling; Standards; Artificial intelligence; vehicular ad hoc networks; federated learning; quantization; VEHICLES;
DOI
10.1109/TITS.2022.3145823
Chinese Library Classification (CLC)
TU [Architectural Science];
Discipline code
0813;
Abstract
Vehicular ad hoc networks (VANETs) are among the most promising approaches for Intelligent Transportation Systems (ITS). With the rapid increase in the amount of traffic data, deep learning-based algorithms have been used extensively in VANETs. The recently proposed federated learning is an attractive candidate for collaborative machine learning: instead of transferring a plethora of data to a centralized server, all clients train their respective local models and upload them to the server for model aggregation. Model quantization is an effective approach to address the communication-efficiency issue in federated learning, yet existing studies largely assume homogeneous quantization for all clients. In reality, however, clients are predominantly heterogeneous and support different quantization precision levels. In this work, we propose FedDO (Federated Learning with Double Optimization). Minimizing the drift term in the convergence analysis, which is a weighted sum of squared quantization errors (SQE) over all clients, leads to a double optimization at both the client and server sides. In particular, each client adopts a fully distributed, instantaneous (per learning round), and individualized (per client) quantization scheme that minimizes its own squared quantization error, and the server computes the aggregation weights that minimize the weighted sum of squared quantization errors over all clients. We show via numerical experiments that the minimal-SQE quantizer outperforms a widely adopted linear quantizer for federated learning. We also demonstrate the performance advantages of FedDO over vanilla FedAvg with standard equal weights and linear quantization.
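The abstract's two-sided optimization can be illustrated with a rough sketch. This is not the paper's actual algorithm: it assumes a 1-D Lloyd-Max (k-means style) quantizer as the per-client minimal-SQE step, and assumes the server-side problem takes the form of minimizing a sum of squared weights times per-client SQEs subject to weights summing to one, which yields weights inversely proportional to each client's error. All function names here are illustrative.

```python
import random

def linear_quantize(values, num_levels):
    # Baseline linear quantizer: uniform levels between min and max,
    # each value rounded to the nearest level.
    lo, hi = min(values), max(values)
    step = (hi - lo) / (num_levels - 1)
    return [lo + round((v - lo) / step) * step for v in values]

def min_sqe_quantize(values, num_levels, iters=20):
    # Illustrative minimal-SQE quantizer: 1-D Lloyd-Max iterations that
    # refine the level set to reduce this client's squared quantization
    # error, starting from the uniform (linear) levels.
    lo, hi = min(values), max(values)
    levels = [lo + i * (hi - lo) / (num_levels - 1) for i in range(num_levels)]
    for _ in range(iters):
        buckets = [[] for _ in levels]
        for v in values:
            j = min(range(len(levels)), key=lambda k: (v - levels[k]) ** 2)
            buckets[j].append(v)
        # Move each level to the mean of its bucket; keep empty levels fixed.
        levels = [sum(b) / len(b) if b else levels[j]
                  for j, b in enumerate(buckets)]
    return [min(levels, key=lambda c: (v - c) ** 2) for v in values]

def sqe(values, quantized):
    # Squared quantization error of one client's model update.
    return sum((v - q) ** 2 for v, q in zip(values, quantized))

def aggregation_weights(errors):
    # Assumed server-side form: minimizing sum_i w_i^2 * e_i subject to
    # sum_i w_i = 1 gives w_i proportional to 1 / e_i (errors must be > 0).
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [x / total for x in inv]
```

Because the Lloyd-Max iterations start from the same uniform levels the linear quantizer uses and never increase the error, the minimal-SQE quantizer's SQE is at most that of the linear quantizer on the same data, matching the comparison reported in the abstract.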
Pages: 19852-19863 (12 pages)
Related papers (50 records)
  • [21] Smart contract assisted secure aggregation scheme for model update in federated learning
    Wu, Caihong
    Liu, Jihua
    COMPUTER NETWORKS, 2024, 250
  • [22] An Improved Federated Learning-Assisted Data Aggregation Scheme for Smart Grids
    Pang, Bo
    Liang, Hui-Hui
    Zhang, Ling-Hao
    Teng, Yu-Fei
    Chang, Zheng-Wei
    Liu, Ze-Wei
    Hu, Chun-Qiang
    Mou, Wen-Hao
    APPLIED SCIENCES-BASEL, 2023, 13 (17)
  • [23] A Novel Blockchain-Assisted Aggregation Scheme for Federated Learning in IoT Networks
    Liu, Zhiming
    Zheng, Kan
    Hou, Lu
    Yang, Haojun
    Yang, Kan
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (19): 17544-17556
  • [24] An Efficient and Multi-Private Key Secure Aggregation Scheme for Federated Learning
    Yang, Xue
    Liu, Zifeng
    Tang, Xiaohu
    Lu, Rongxing
    Liu, Bo
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (05): 1998-2011
  • [25] MHAT: An efficient model-heterogenous aggregation training scheme for federated learning
    Hu, Li
    Yan, Hongyang
    Li, Lang
    Pan, Zijie
    Liu, Xiaozhang
    Zhang, Zulong
    INFORMATION SCIENCES, 2021, 560: 493-503
  • [26] An effective and verifiable secure aggregation scheme with privacy-preserving for federated learning
    Wang, Rong
    Xiong, Ling
    Geng, Jiazhou
    Xie, Chun
    Li, Ruidong
    JOURNAL OF SYSTEMS ARCHITECTURE, 2025, 161
  • [27] Privacy-Preserving Data Aggregation Scheme Based on Federated Learning for IIoT
    Fan, Hongbin
    Zhou, Zhi
    MATHEMATICS, 2023, 11 (01)
  • [28] EPPDA: An Efficient Privacy-Preserving Data Aggregation Federated Learning Scheme
    Song, Jingcheng
    Wang, Weizheng
    Gadekallu, Thippa Reddy
    Cao, Jianyu
    Liu, Yining
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2023, 10 (05): 3047-3057
  • [29] Adaptive federated learning secure aggregation scheme based on threshold homomorphic encryption
    Ma Z.
    Jin J.
    Yang Y.
    Liu Y.
    Ying Z.
    Li T.
    Zhang J.
    Tongxin Xuebao/Journal on Communications, 2023, 44 (07): 76-85
  • [30] Catch-Up: A Data Aggregation Scheme for VANETs
    Yu, Bo
    Gong, Jiayu
    Xu, Cheng-Zhong
    VANET'08: PROCEEDINGS OF THE FIFTH ACM INTERNATIONAL WORKSHOP ON VEHICULAR INTER-NETWORKING, 2008: 49-57