Joint Optimal Quantization and Aggregation of Federated Learning Scheme in VANETs

Cited: 19
Authors
Li, Yifei [1 ]
Guo, Yijia [2 ]
Alazab, Mamoun [3 ]
Chen, Shengbo [1 ]
Shen, Cong [4 ]
Yu, Keping [5 ]
Affiliations
[1] Henan Univ, Sch Comp & Informat Engn, Kaifeng 475001, Peoples R China
[2] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100190, Peoples R China
[3] Charles Darwin Univ, Coll Engn IT & Environm, Casuarina, NT 0810, Australia
[4] Univ Virginia, Charles L Brown Dept Elect & Comp Engn, Charlottesville, VA 22904 USA
[5] Waseda Univ, Global Informat & Telecommun Inst, Shinjuku Ku, Tokyo 1698050, Japan
Funding
National Natural Science Foundation of China; Japan Society for the Promotion of Science;
Keywords
Quantization (signal); Servers; Collaborative work; Optimization; Data models; Computational modeling; Standards; Artificial intelligence; vehicular ad hoc networks; federated learning; quantization; VEHICLES;
DOI
10.1109/TITS.2022.3145823
Chinese Library Classification (CLC)
TU [Building Science];
Discipline Classification Code
0813;
Abstract
Vehicular ad hoc networks (VANETs) are among the most promising approaches to Intelligent Transportation Systems (ITS). With the rapid growth in the volume of traffic data, deep-learning-based algorithms have been used extensively in VANETs. The recently proposed federated learning is an attractive candidate for collaborative machine learning: instead of transferring large volumes of data to a centralized server, each client trains its own local model and uploads it to the server for model aggregation. Model quantization is an effective approach to the communication-efficiency issue in federated learning, yet existing studies largely assume homogeneous quantization across all clients. In reality, clients are predominantly heterogeneous, supporting different quantization precision levels. In this work, we propose FedDO (Federated Learning with Double Optimization). Minimizing the drift term in the convergence analysis, which is a weighted sum of squared quantization errors (SQE) over all clients, leads to a double optimization at both the client and server sides. In particular, each client adopts a fully distributed, instantaneous (per learning round), and individualized (per client) quantization scheme that minimizes its own squared quantization error, and the server computes the aggregation weights that minimize the weighted sum of squared quantization errors over all clients. We show via numerical experiments that the minimal-SQE quantizer outperforms a widely adopted linear quantizer for federated learning. We also demonstrate the performance advantages of FedDO over vanilla FedAvg with standard equal weights and linear quantization.
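The abstract outlines the two-sided optimization but, as an abstract, gives no implementation detail. The Python sketch below illustrates one plausible reading, not the paper's actual method: it assumes each client uses a uniform quantizer whose clipping range is grid-searched each round to minimize that client's SQE, and it assumes a drift term of the hypothetical form sum_k w_k^2 * SQE_k subject to sum_k w_k = 1, whose minimizer is w_k proportional to 1/SQE_k. The function names and all parameter values are illustrative assumptions.

# Illustrative sketch only -- not the authors' implementation.
# Client side: per-round, per-client minimal-SQE uniform quantization.
# Server side: aggregation weights minimizing a hypothetical drift term
# sum_k w_k^2 * SQE_k with sum_k w_k = 1 (Lagrange => w_k ~ 1/SQE_k).
import numpy as np

def quantize_min_sqe(update, bits, n_grid=50):
    """Uniform quantizer with 2**bits levels; the clipping range is chosen
    from a grid to minimize this client's squared quantization error (SQE)."""
    levels = 2 ** bits
    max_abs = float(np.max(np.abs(update)))
    if max_abs == 0.0:                       # all-zero update: nothing to quantize
        return update.copy(), 0.0
    best_q, best_sqe = None, np.inf
    for c in np.linspace(max_abs / n_grid, max_abs, n_grid):
        step = 2.0 * c / (levels - 1)        # uniform step size over [-c, c]
        q = np.clip(np.round(update / step) * step, -c, c)
        sqe = float(np.sum((update - q) ** 2))
        if sqe < best_sqe:
            best_q, best_sqe = q, sqe
    return best_q, best_sqe

def aggregate_min_drift(quantized_updates, sqes):
    """Server: minimize sum_k w_k^2 * SQE_k s.t. sum_k w_k = 1,
    which gives w_k proportional to 1 / SQE_k."""
    inv = 1.0 / np.maximum(np.asarray(sqes), 1e-12)
    w = inv / inv.sum()
    agg = sum(wk * qk for wk, qk in zip(w, quantized_updates))
    return agg, w

# Heterogeneous precision: client k only supports bits[k] quantization bits.
rng = np.random.default_rng(0)
updates = [rng.normal(scale=s, size=1000) for s in (0.5, 1.0, 2.0)]
bits = [2, 4, 8]
qs, es = zip(*(quantize_min_sqe(u, b) for u, b in zip(updates, bits)))
agg, w = aggregate_min_drift(list(qs), es)
print("per-client SQE:", np.round(es, 3), "aggregation weights:", np.round(w, 3))

On this reading, low-precision clients report larger SQEs and are automatically down-weighted at aggregation, which is consistent with the abstract's point that heterogeneous precision levels should not all receive equal (FedAvg-style) weights.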
Pages: 19852 - 19863
Number of pages: 12
Related Papers
50 records in total
  • [41] A Data Aggregation Scheme for Traffic Information Systems in Urban VANETs
    Guedes, Bruno F.
    Campos, Carlos A. V.
    2016 IEEE 19TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC), 2016, : 564 - 569
  • [42] Frequency Modulation Aggregation for Federated Learning
    Martinez-Gost, Marc
    Perez-Neira, Ana
    Lagunas, Miguel Angel
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 1878 - 1883
  • [43] Robust Aggregation Function in Federated Learning
    Taheri, Rahim
    Arabikhan, Farzad
    Gegov, Alexander
    Akbari, Negar
    ADVANCES IN INFORMATION SYSTEMS, ARTIFICIAL INTELLIGENCE AND KNOWLEDGE MANAGEMENT, ICIKS 2023, 2024, 486 : 168 - 175
  • [44] Lazy Aggregation for Heterogeneous Federated Learning
    Xu, Gang
    Kong, De-Lun
    Chen, Xiu-Bo
    Liu, Xin
    APPLIED SCIENCES-BASEL, 2022, 12 (17)
  • [45] Adapted Weighted Aggregation in Federated Learning
    Tang, Yitong
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38, NO 21, 2024, : 23763 - 23765
  • [46] SwiftAgg+: Achieving Asymptotically Optimal Communication Loads in Secure Aggregation for Federated Learning
    Jahani-Nezhad, Tayyebeh
    Maddah-Ali, Mohammad Ali
    Li, Songze
    Caire, Giuseppe
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 977 - 989
  • [47] Federated Learning with Buffered Asynchronous Aggregation
    Nguyen, John
    Malik, Kshitiz
    Zhan, Hongyuan
    Yousefpour, Ashkan
    Rabbat, Michael
    Malek, Mani
    Huba, Dzmitry
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [48] Evaluation of Federated Learning Aggregation Algorithms
    Ek, Sannara
    Portet, Francois
    Lalanda, Philippe
    Vega, German
    UBICOMP/ISWC '20 ADJUNCT: PROCEEDINGS OF THE 2020 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2020 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS, 2020, : 638 - 643
  • [49] Towards Efficient Federated Learning: Layer-Wise Pruning-Quantization Scheme and Coding Design
    Zhu, Zheqi
    Shi, Yuchen
    Xin, Gangtao
    Peng, Chenghui
    Fan, Pingyi
    Letaief, Khaled B.
    ENTROPY, 2023, 25 (08)
  • [50] Deep Reinforcement Learning-based Quantization for Federated Learning
    Zheng, Sihui
    Dong, Yuhan
    Chen, Xiang
    2023 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC, 2023