Communication-Efficient Federated Double Distillation in IoV

Cited: 3
Authors
Yang, Peng [1 ,2 ]
Yan, Mengjiao [1 ,2 ]
Cui, Yaping [1 ,2 ]
He, Peng [1 ,2 ]
Wu, Dapeng [1 ,2 ]
Wang, Ruyan [1 ,2 ]
Chen, Luo [1 ,2 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Sch Commun & Informat Engn, Adv Network & Intelligent Connect Technol, Key Lab Chongqing Educ Commiss China, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Ubiquitous Sensing & Networking, Chongqing 400065, Peoples R China
Keywords
Training; Servers; Federated learning; Data models; Computational modeling; Vehicle dynamics; Technological innovation; Vehicular edge computing; federated learning; communication efficiency; knowledge distillation; SELECTION; INTERNET
DOI
10.1109/TCCN.2023.3286665
CLC Classification
TN [Electronic technology, communication technology]
Discipline Code
0809
Abstract
6G will be the next horizon, moving from connected people and things to the intelligence of everything. Machine learning (ML) brings artificial intelligence to vehicular edge computing (VEC), further unlocking the potential of data. Federated learning (FL) coordinates collaborative training between vehicular clients and the roadside unit (RSU), enabling efficient distributed data processing. However, vehicular clients upload large numbers of model parameters that occupy scarce communication resources, and link stability cannot be guaranteed due to vehicle mobility, which may lead to expensive communication overhead in VEC. This paper therefore proposes a communication-efficient federated double distillation (FedDD) framework, which comprehensively considers three-dimensional vehicle attributes and dynamically selects cluster heads (CHs) to improve transmission efficiency. Knowledge distillation is then integrated into federated learning to carry out multiple rounds of parameter distillation, significantly reducing communication overhead. Experimental results show that FedDD reduces communication overhead by three orders of magnitude compared with traditional FedAvg, and by 82% compared with FTTQ, improving the communication efficiency of FL at the cost of a small loss in accuracy.
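The abstract outlines, but does not specify, how CH selection and the distillation exchange work. Purely as a hedged illustration, the Python sketch below mimics the two ideas: a weighted score over three vehicle attributes picks a cluster head, and clients upload soft predictions (logits) on a small shared public set instead of full model parameters. The attribute names (proximity, stability, link_quality), the equal weights, the linear client model, and the plain logit averaging are all assumptions made for this sketch, not the paper's actual FedDD method.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Cluster-head (CH) selection --------------------------------------------
# The paper scores vehicles on three-dimensional attributes; the concrete
# attributes and weights are not given in the abstract, so the names and the
# equal weights below are assumptions.
ATTRS = ("proximity", "stability", "link_quality")

def select_cluster_head(vehicles, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Return the index of the vehicle with the highest weighted score."""
    scores = [sum(w * v[a] for w, a in zip(weights, ATTRS)) for v in vehicles]
    return int(np.argmax(scores))

# --- Distillation-style exchange ---------------------------------------------
# Instead of uploading full parameters, each client uploads soft predictions
# (logits) on a shared public set; the server (RSU) averages them into a
# teacher signal. A linear model stands in for an arbitrary client network.

def client_logits(model_params, public_x):
    """Hypothetical linear client model: logits = X @ W."""
    return public_x @ model_params

def aggregate_logits(all_logits):
    """RSU-side averaging of the uploaded soft labels."""
    return np.mean(all_logits, axis=0)

# Toy setup: 5 vehicles, a 20-feature / 10-class linear model, 100 public samples.
vehicles = [{a: rng.random() for a in ATTRS} for _ in range(5)]
public_x = rng.normal(size=(100, 20))
client_models = [rng.normal(size=(20, 10)) for _ in vehicles]

ch = select_cluster_head(vehicles)
teacher = aggregate_logits([client_logits(m, public_x) for m in client_models])

print(f"cluster head: vehicle {ch}")
# Per-round upload is 100 samples x 10 classes = 1000 values per client,
# independent of how many parameters the client model has.
print(f"per-round upload per client: {teacher.size} logit values")
```

The toy numbers carry the communication argument: the per-round upload scales with the size of the public set and the number of classes, not with the client model's parameter count, which is consistent with the orders-of-magnitude savings the abstract reports.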
Pages: 1340-1352
Number of pages: 13
Related Papers
50 records in total
  • [31] FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
    Chan, Yun Hin
    Ngai, Edith C. H.
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 207 - 214
  • [32] FedADP: Communication-Efficient by Model Pruning for Federated Learning
    Liu, Haiyang
    Shi, Yuliang
    Su, Zhiyuan
    Zhang, Kun
    Wang, Xinjun
    Yan, Zhongmin
    Kong, Fanyu
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 3093 - 3098
  • [33] Communication-Efficient Robust Federated Learning with Noisy Labels
    Li, Junyi
    Pei, Jian
    Huang, Heng
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 914 - 924
  • [34] Communication-Efficient Federated Learning via Predictive Coding
    Yue, Kai
    Jin, Richeng
    Wong, Chau-Wai
    Dai, Huaiyu
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2022, 16 (03) : 369 - 380
  • [35] Communication-Efficient Federated Learning With Data and Client Heterogeneity
    Zakerinia, Hossein
    Talaei, Shayan
    Nadiradze, Giorgi
    Alistarh, Dan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [36] Communication-Efficient Wireless Traffic Prediction with Federated Learning
    Gao, Fuwei
    Zhang, Chuanting
    Qiao, Jingping
    Li, Kaiqiang
    Cao, Yi
    MATHEMATICS, 2024, 12 (16)
  • [37] Communication-Efficient Consensus Mechanism for Federated Reinforcement Learning
    Xu, Xing
    Li, Rongpeng
    Zhao, Zhifeng
    Zhang, Honggang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 80 - 85
  • [38] Communication-efficient Federated Learning with Cooperative Filter Selection
    Yang, Zhao
    Sun, Qingshuang
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 2172 - 2176
  • [39] Communication-Efficient Federated Learning With Binary Neural Networks
    Yang, Yuzhi
    Zhang, Zhaoyang
    Yang, Qianqian
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12) : 3836 - 3850
  • [40] Communication-Efficient Federated Learning with Adaptive Consensus ADMM
    He, Siyi
    Zheng, Jiali
    Feng, Minyu
    Chen, Yixin
APPLIED SCIENCES-BASEL, 2023, 13 (09)