Communication-Efficient Federated Double Distillation in IoV

Cited by: 3
Authors
Yang, Peng [1 ,2 ]
Yan, Mengjiao [1 ,2 ]
Cui, Yaping [1 ,2 ]
He, Peng [1 ,2 ]
Wu, Dapeng [1 ,2 ]
Wang, Ruyan [1 ,2 ]
Chen, Luo [1 ,2 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Sch Commun & Informat Engn, Adv Network & Intelligent Connect Technol, Key Lab Chongqing Educ Commiss China, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Ubiquitous Sensing & Networking, Chongqing 400065, Peoples R China
Keywords
Training; Servers; Federated learning; Data models; Computational modeling; Vehicle dynamics; Technological innovation; Vehicular edge computing; federated learning; communication efficiency; knowledge distillation; SELECTION; INTERNET
DOI
10.1109/TCCN.2023.3286665
CLC Classification Number
TN [Electronic Technology, Communication Technology]
Subject Classification Code
0809
Abstract
6G will be the next horizon, moving from connected people and things to intelligence-of-everything. Machine learning (ML) combines artificial intelligence with vehicular edge computing (VEC), further unlocking the potential of data. Federated learning (FL) manages the collaborative training between vehicular clients and the road side unit (RSU), efficiently completing distributed data processing. However, vehicular clients upload large numbers of model parameters, occupying scarce communication resources, and link stability cannot be guaranteed due to vehicle mobility, which may lead to expensive communication overhead in VEC. Thus, this paper proposes a communication-efficient federated double distillation (FedDD) framework that comprehensively considers the three-dimensional attributes of the vehicles and dynamically selects cluster-heads (CHs) to improve transmission efficiency. Knowledge distillation is then integrated into federated learning to accomplish multiple rounds of parameter distillation, significantly reducing the communication overhead. Experimental results show that, compared to traditional FedAvg, FedDD reduces communication overhead by three orders of magnitude; compared to FTTQ, it reduces communication overhead by 82%, improving the communication efficiency of FL while sacrificing only a small amount of accuracy.
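The paper's exact FedDD procedure is not reproduced in this record, but the core idea behind distillation-based FL — exchanging softened predictions (knowledge) on a small shared proxy set instead of full model parameters — can be illustrated with a minimal sketch. All sizes and names below (parameter count, proxy-set size, averaging as the aggregation rule) are illustrative assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, T=1.0):
    """Temperature-softened softmax; higher T gives smoother soft labels."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical sizes: a 1M-parameter model vs soft labels on a small
# shared proxy set (100 samples x 10 classes).
N_PARAMS = 1_000_000
PROXY, CLASSES, CLIENTS = 100, 10, 5

# Stand-in for each vehicular client's local-model logits on the shared
# proxy set (random here; a real client would run its trained model).
client_logits = [rng.normal(size=(PROXY, CLASSES)) for _ in range(CLIENTS)]

# Clients upload temperature-softened predictions instead of parameters.
T = 3.0
uploads = [softmax(z, T) for z in client_logits]

# The RSU aggregates the uploaded knowledge; plain averaging here.
teacher = np.mean(uploads, axis=0)  # shape (PROXY, CLASSES)

# Per-round uplink payload: soft labels vs a full parameter vector.
payload_distill = PROXY * CLASSES   # floats per client per round
payload_fedavg = N_PARAMS
print(payload_fedavg / payload_distill)  # 1000.0 — three orders of magnitude
```

Each client would then distill from `teacher` locally as its training target. The orders-of-magnitude gap between the two payloads is what makes distillation-style uploads attractive in VEC, where vehicle mobility makes large, repeated parameter transfers costly.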
Pages: 1340 - 1352
Page count: 13
Related Papers (50 records)
  • [21] Joint Knowledge Distillation and Local Differential Privacy for Communication-Efficient Federated Learning in Heterogeneous Systems
    Gad, Gad
    Fadlullah, Zubair Md
    Fouda, Mostafa M.
    Ibrahem, Mohamed I.
    Nasser, Nidal
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 2051 - 2056
  • [22] Communication-Efficient Secure Aggregation for Federated Learning
    Ergun, Irem
    Sami, Hasin Us
    Guler, Basak
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 3881 - 3886
  • [23] FedBoost: Communication-Efficient Algorithms for Federated Learning
    Hamer, Jenny
    Mohri, Mehryar
    Suresh, Ananda Theertha
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [24] FedAGL: A Communication-Efficient Federated Vehicular Network
    Liu, Su
    Li, Yushuai
    Guan, Peiyuan
    Li, Tianyi
    Yu, Jiong
    Taherkordi, Amir
    Jensen, Christian S.
    IEEE TRANSACTIONS ON INTELLIGENT VEHICLES, 2024, 9 (02): : 3704 - 3720
  • [25] Ternary Compression for Communication-Efficient Federated Learning
    Xu, Jinjin
    Du, Wenli
    Jin, Yaochu
    He, Wangli
    Cheng, Ran
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 1162 - 1176
  • [26] Federated Learning with Autotuned Communication-Efficient Secure Aggregation
    Bonawitz, Keith
    Salehi, Fariborz
    Konecny, Jakub
    McMahan, Brendan
    Gruteser, Marco
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 1222 - 1226
  • [27] On the Design of Communication-Efficient Federated Learning for Health Monitoring
    Chu, Dong
    Jaafar, Wael
    Yanikomeroglu, Halim
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1128 - 1133
  • [28] ALS Algorithm for Robust and Communication-Efficient Federated Learning
    Hurley, Neil
    Duriakova, Erika
    Geraci, James
    O'Reilly-Morgan, Diarmuid
    Tragos, Elias
    Smyth, Barry
    Lawlor, Aonghus
    PROCEEDINGS OF THE 2024 4TH WORKSHOP ON MACHINE LEARNING AND SYSTEMS, EUROMLSYS 2024, 2024, : 56 - 64
  • [29] Communication-Efficient Federated Learning For Massive MIMO Systems
    Mu, Yuchen
    Garg, Navneet
    Ratnarajah, Tharmalingam
    2022 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2022, : 578 - 583
  • [30] Communication-Efficient Design for Quantized Decentralized Federated Learning
    Chen, Li
    Liu, Wei
    Chen, Yunfei
    Wang, Weidong
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 1175 - 1188