Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks

Cited by: 5
Authors
Liu, Yi-Jing [1 ,2 ]
Feng, Gang [1 ,2 ]
Niyato, Dusit [3 ]
Qin, Shuang [1 ,2 ]
Zhou, Jianhong [4 ]
Li, Xiaoqian [1 ,2 ]
Xu, Xinyi [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Natl Key Lab Commun, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Yangtze Delta Reg Inst Huzhou, Huzhou 313001, Peoples R China
[3] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[4] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
US National Science Foundation;
Keywords
Federated learning; wireless network; adaptive quantization; ensemble distillation; heterogeneous models; AGGREGATION;
DOI
10.1109/TWC.2022.3222717
Chinese Library Classification (CLC)
TM [Electrotechnics]; TN [Electronics & Communication Technology];
Discipline codes
0808; 0809;
Abstract
Federated learning (FL) has become a promising technique for developing intelligent wireless networks. In traditional FL paradigms, local models are usually required to be homogeneous for aggregation. However, because model heterogeneity arises from wireless system heterogeneity, it is preferable for user equipments (UEs) to undertake an appropriate amount of computing and/or data transmission work based on system constraints. Meanwhile, model training incurs considerable communication costs when a large number of UEs participate in FL and/or the transmitted models are large. Therefore, resource-efficient training schemes for heterogeneous models are essential for enabling FL-based intelligent wireless networks. In this paper, we propose an adaptive quantization scheme based on ensemble distillation (AQeD) to facilitate heterogeneous model training. We first partition the participating UEs into clusters, where the local models within each cluster are homogeneous but use different quantization levels. We then propose an augmented loss function that jointly considers the ensemble distillation loss, quantization levels, and wireless resource constraints. In AQeD, model aggregation is performed at two levels: model aggregation within individual clusters and distillation loss aggregation across the cluster ensemble. Numerical results show that the AQeD scheme can significantly reduce communication costs and training time compared with state-of-the-art solutions.
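The two building blocks the abstract names, per-cluster model quantization and an ensemble distillation loss, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the uniform quantizer, the KL-based distillation loss, the temperature `T`, and all function names are assumptions chosen for clarity.

```python
import numpy as np

def quantize(weights, bits):
    """Uniform quantization of a weight vector to 2**bits levels,
    mapped over the observed [min, max] range (illustrative choice)."""
    w_min, w_max = float(weights.min()), float(weights.max())
    if w_max == w_min:
        return weights.copy()
    levels = 2 ** bits - 1          # number of quantization steps
    step = (w_max - w_min) / levels
    return w_min + np.round((weights - w_min) / step) * step

def _softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, ensemble_logits, T=2.0):
    """KL divergence between the temperature-softened ensemble
    prediction (teacher) and the student's prediction."""
    p = _softmax(ensemble_logits / T)   # teacher distribution
    q = _softmax(student_logits / T)    # student distribution
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)))

def aggregate_cluster(cluster_models):
    """First-level aggregation: average homogeneous models in one cluster."""
    return np.mean(cluster_models, axis=0)

def ensemble_logits(per_cluster_logits):
    """Second-level aggregation: average the clusters' output logits
    to form the ensemble teacher signal."""
    return np.mean(per_cluster_logits, axis=0)
```

A coarser cluster (fewer bits) then transmits a smaller quantized model, while the distillation term keeps its predictions aligned with the cluster ensemble; the paper additionally folds quantization levels and wireless resource constraints into the training loss, which is omitted here.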
Pages: 4013-4027
Page count: 15
Related Papers
50 in total
  • [21] Federated Learning in Heterogeneous Wireless Networks With Adaptive Mixing Aggregation and Computation Reduction
    Li, Jingxin
    Liu, Xiaolan
    Mahmoodi, Toktam
    IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY, 2024, 5 : 2164 - 2182
  • [22] Reputation-Based Federated Learning for Secure Wireless Networks
    Song, Zhendong
    Sun, Hongguang
    Yang, Howard H.
    Wang, Xijun
    Zhang, Yan
    Quek, Tony Q. S.
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (02) : 1212 - 1226
  • [23] Adaptive Federated Pruning in Hierarchical Wireless Networks
    Liu, Xiaonan
    Wang, Shiqiang
    Deng, Yansha
    Nallanathan, Arumugam
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (06) : 5985 - 5999
  • [24] Effective Intrusion Detection in Heterogeneous Internet-of-Things Networks via Ensemble Knowledge Distillation-based Federated Learning
    Shen, Jiyuan
    Yang, Wenzhuo
    Chu, Zhaowei
    Fan, Jiani
    Niyato, Dusit
    Lam, Kwok-Yan
    ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024, : 2034 - 2039
  • [25] Latency-Efficient Wireless Federated Learning With Quantization and Scheduling
    Yan, Zhigang
    Li, Dong
    Yu, Xianhua
    Zhang, Zhichao
    IEEE COMMUNICATIONS LETTERS, 2022, 26 (11) : 2621 - 2625
  • [26] Distributed adaptive quantization for wireless sensor networks
    Fang, Jun
    Li, Hongbin
    CONFERENCE RECORD OF THE FORTY-FIRST ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, VOLS 1-5, 2007, : 1372 - 1376
  • [27] Smart algorithm in wireless networks for video streaming based on adaptive quantization
    Taha, Miran
    Ali, Aree
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (09):
  • [28] Adaptive Modulation for Wireless Federated Edge Learning
    Xu, Xinyi
    Yu, Guanding
    Liu, Shengli
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2023, 9 (04) : 1096 - 1109
  • [29] Adaptive Network Pruning for Wireless Federated Learning
    Liu, Shengli
    Yu, Guanding
    Yin, Rui
    Yuan, Jiantao
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2021, 10 (07) : 1572 - 1576
  • [30] Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data
    Ahn, Jin-Hyun
    Simeone, Osvaldo
    Kang, Joonhyuk
    2019 IEEE 30TH ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2019, : 1138 - 1143