Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks

Cited: 5
Authors
Liu, Yi-Jing [1 ,2 ]
Feng, Gang [1 ,2 ]
Niyato, Dusit [3 ]
Qin, Shuang [1 ,2 ]
Zhou, Jianhong [4 ]
Li, Xiaoqian [1 ,2 ]
Xu, Xinyi [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Natl Key Lab Commun, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Yangtze Delta Reg Inst Huzhou, Huzhou 313001, Peoples R China
[3] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[4] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Federated learning; wireless network; adaptive quantization; ensemble distillation; heterogeneous models; AGGREGATION;
DOI
10.1109/TWC.2022.3222717
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Federated learning (FL) has become a promising technique for developing intelligent wireless networks. In traditional FL paradigms, local models are usually required to be homogeneous for aggregation. However, owing to the model heterogeneity that accompanies wireless system heterogeneity, it is preferable for user equipments (UEs) to undertake an appropriate amount of computing and/or data transmission work based on system constraints. Meanwhile, model training incurs considerable communication costs when a large number of UEs participate in FL and/or the transmitted models are large. Therefore, resource-efficient training schemes for heterogeneous models are essential for enabling FL-based intelligent wireless networks. In this paper, we propose an adaptive quantization scheme based on ensemble distillation (AQeD) to facilitate heterogeneous model training. We first partition the participating UEs into clusters, where the local models within each cluster are homogeneous with different quantization levels. Then we propose an augmented loss function that jointly considers the ensemble distillation loss, quantization levels, and wireless resource constraints. In AQeD, model aggregation is performed at two levels: model aggregation for individual clusters and distillation-loss aggregation for cluster ensembles. Numerical results show that the AQeD scheme can significantly reduce communication costs and training time in comparison with state-of-the-art solutions.
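The two mechanisms named in the abstract can be illustrated with a minimal sketch: per-cluster weight quantization and averaging (level one), and a distillation loss that pulls each cluster toward the softened ensemble prediction of all clusters (level two). Note this is an illustrative reconstruction under assumed details, not the paper's exact formulation: the uniform quantizer, the function names, and the KL-based distillation loss are all assumptions for the sake of the example.

```python
import numpy as np

def quantize(weights, bits):
    """Uniformly quantize a weight vector to 2**bits levels (assumed scheme)."""
    lo, hi = weights.min(), weights.max()
    if hi == lo:
        return weights.copy()
    levels = 2 ** bits - 1
    scaled = np.round((weights - lo) / (hi - lo) * levels)
    return lo + scaled * (hi - lo) / levels

def cluster_aggregate(client_weights):
    """Level 1: FedAvg-style mean over homogeneous models within one cluster."""
    return np.mean(np.stack(client_weights), axis=0)

def ensemble_distillation_loss(cluster_logits, temperature=1.0):
    """Level 2: average KL divergence from each cluster's softened prediction
    to the ensemble average of all clusters, on a shared batch of inputs."""
    def softmax(z):
        e = np.exp((z - z.max(axis=-1, keepdims=True)) / temperature)
        return e / e.sum(axis=-1, keepdims=True)
    probs = [softmax(l) for l in cluster_logits]
    ensemble = np.mean(probs, axis=0)
    # KL(ensemble || cluster), averaged over clusters and batch samples
    losses = [np.mean(np.sum(ensemble * (np.log(ensemble + 1e-12)
                                         - np.log(p + 1e-12)), axis=-1))
              for p in probs]
    return float(np.mean(losses))
```

In this sketch, each cluster first averages its (quantized) homogeneous models, then all clusters exchange only logits on a shared batch, so the distillation step works across heterogeneous architectures without exchanging weights.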
Pages: 4013-4027 (15 pages)
Related Papers
50 records in total
  • [1] Resource Consumption for Supporting Federated Learning in Wireless Networks
    Liu, Yi-Jing
    Qin, Shuang
    Sun, Yao
    Feng, Gang
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (11) : 9974 - 9989
  • [2] Adaptive Hierarchical Federated Learning Over Wireless Networks
    Xu, Bo
    Xia, Wenchao
    Wen, Wanli
    Liu, Pei
    Zhao, Haitao
    Zhu, Hongbo
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (02) : 2070 - 2083
  • [3] Ensemble Federated Learning With Non-IID Data in Wireless Networks
    Zhao, Zhongyuan
    Wang, Jingyi
    Hong, Wei
    Quek, Tony Q. S.
    Ding, Zhiguo
    Peng, Mugen
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (04) : 3557 - 3571
  • [4] Adaptive Transmission Scheduling in Wireless Networks for Asynchronous Federated Learning
    Lee, Hyun-Suk
    Lee, Jang-Won
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12) : 3673 - 3687
  • [5] Adaptive Quantization based on Ensemble Distillation to Support FL enabled Edge Intelligence
    Liu, Yi-Jing
    Qin, Shuang
    Feng, Gang
    Niyato, Dusit
    Sun, Yao
    Zhou, Jianhong
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 2194 - 2199
  • [6] Quantization Bits Allocation for Wireless Federated Learning
    Lan, Muhang
    Ling, Qing
    Xiao, Song
    Zhang, Wenyi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (11) : 8336 - 8351
  • [7] Adaptive Quantization Mechanism for Federated Learning Models Based on DAG Blockchain
    Li, Tong
    Yang, Chao
    Wang, Lei
    Li, Tingting
    Zhao, Hai
    Chen, Jiewei
    ELECTRONICS, 2023, 12 (17)
  • [8] Adaptive Model Pruning and Personalization for Federated Learning Over Wireless Networks
    Liu, Xiaonan
    Ratnarajah, Tharmalingam
    Sellathurai, Mathini
    Eldar, Yonina C.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 4395 - 4411
  • [9] Adaptive Semi-Asynchronous Federated Learning Over Wireless Networks
    Chen, Zhixiong
    Yi, Wenqiang
    Shin, Hyundong
    Nallanathan, Arumugam
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2025, 73 (01) : 394 - 409
  • [10] FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation
    Chen, Leiming
    Zhang, Weishan
    Dong, Cihao
    Zhao, Dehai
    Zeng, Xingjie
    Qiao, Sibo
    Zhu, Yichang
    Tan, Chee Wei
    ENTROPY, 2024, 26 (01)