HCFL: A High Compression Approach for Communication-Efficient Federated Learning in Very Large Scale IoT Networks

Cited by: 13
Authors
Nguyen, Minh-Duong [1 ]
Lee, Sang-Min [1 ]
Pham, Quoc-Viet [2 ]
Hoang, Dinh Thai [3 ]
Nguyen, Diep N. [3 ]
Hwang, Won-Joo [4 ]
Affiliations
[1] Pusan Natl Univ, Dept Informat Convergence Engn, Pusan 46241, South Korea
[2] Pusan Natl Univ, Korean Southeast Ctr Ind Revolut Leader Educ 4, Pusan 46241, South Korea
[3] Univ Technol Sydney, Sch Elect & Data Engn, Sydney, NSW 2007, Australia
[4] Pusan Natl Univ, Dept Biomed Convergence Engn, Yangsan 50612, South Korea
Funding
National Research Foundation of Singapore; Australian Research Council
Keywords
Autoencoder; communication efficiency; data compression; deep learning; distributed learning; federated learning; internet-of-things; machine type communication;
DOI
10.1109/TMC.2022.3190510
CLC number
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Federated learning (FL) is a new artificial intelligence concept that enables Internet-of-Things (IoT) devices to learn a collaborative model without sending raw data to centralized nodes for processing. Despite its numerous advantages, the low computing resources of IoT devices and the high communication cost of exchanging model parameters severely limit FL applications in massive IoT networks. In this work, we develop a novel compression scheme for FL, called high-compression federated learning (HCFL), for very large scale IoT networks. HCFL can reduce the data load of FL processes without changing their structure and hyperparameters. In this way, we not only significantly reduce communication costs but also make intensive learning processes more adaptable to low-computing-resource IoT devices. Furthermore, we investigate the relationship between the number of IoT devices and the convergence level of the FL model, and thereby better assess the quality of the FL process. We demonstrate our HCFL scheme through both simulations and mathematical analyses. Our theoretical results can be used as a minimum level of satisfaction, proving that the FL process can achieve good performance when a determined configuration is met. Therefore, we show that HCFL is applicable to any FL-integrated network with numerous IoT devices.
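The keywords indicate that HCFL compresses the exchanged model parameters with an autoencoder before uplink transmission. As a rough illustration of that general idea only (not HCFL's actual architecture, which the paper details), the sketch below compresses a client's flattened parameter vector with a simple linear encoder/decoder fitted by PCA; all names, dimensions, and the 16x ratio are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear_codec(samples, k):
    """Fit a rank-k linear codec (PCA) on example parameter vectors."""
    mean = samples.mean(axis=0)
    # Top-k right singular vectors of the centered samples form the
    # k x d encoding matrix; its transpose decodes.
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    return mean, vt[:k]

def compress(theta, mean, basis):
    """Client side: map d parameters to k floats for the uplink."""
    return basis @ (theta - mean)

def decompress(code, mean, basis):
    """Server side: reconstruct the d-dimensional parameter vector."""
    return mean + basis.T @ code

d, k = 256, 16  # 16x fewer floats sent per round (illustrative)

# Synthetic "model updates" that happen to live in a k-dim subspace,
# so the rank-k codec reconstructs them almost exactly.
samples = rng.normal(size=(100, k)) @ rng.normal(size=(k, d))
mean, basis = fit_linear_codec(samples, k)

theta = samples[0]                       # one client's flattened update
code = compress(theta, mean, basis)      # k floats on the uplink
rec = decompress(code, mean, basis)      # server-side reconstruction

print("compression ratio:", d / k)
print("relative error:",
      np.linalg.norm(rec - theta) / np.linalg.norm(theta))
```

In practice a nonlinear autoencoder can exploit structure that a linear codec cannot, at the cost of extra client-side computation; the paper's contribution is making such compression work without altering the FL structure or hyperparameters.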
Pages: 6495-6507 (13 pages)
Related Papers (50 total)
  • [11] FedDQ: A communication-efficient federated learning approach for Internet of Vehicles
    Mo, Zijia
    Gao, Zhipeng
    Zhao, Chen
    Lin, Yijing
    JOURNAL OF SYSTEMS ARCHITECTURE, 2022, 131
  • [13] Communication-efficient and privacy-preserving large-scale federated learning counteracting heterogeneity
    Zhou, Xingcai
    Yang, Guang
    INFORMATION SCIENCES, 2024, 661
  • [14] A Communication-Efficient Federated Learning Scheme for IoT-Based Traffic Forecasting
    Zhang, Chenhan
    Cui, Lei
    Yu, Shui
    Yu, James J. Q.
    IEEE INTERNET OF THINGS JOURNAL, 2021, 9 (14): 11918 - 11931
  • [15] Communication-Efficient Personalized Federated Learning for Digital Twin in Heterogeneous Industrial IoT
    Wang, Zhihan
    Ma, Xiangxue
    Zhang, Haixia
    Yuan, Dongfeng
    2023 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS, 2023, : 237 - 241
  • [16] Communication-efficient hierarchical federated learning for IoT heterogeneous systems with imbalanced data
    Abdellatif, Alaa Awad
    Mhaisen, Naram
    Mohamed, Amr
    Erbad, Aiman
    Guizani, Mohsen
    Dawy, Zaher
    Nasreddine, Wassim
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2022, 128 : 406 - 419
  • [17] Communication-efficient Federated Learning for UAV Networks with Knowledge Distillation and Transfer Learning
    Li, Yalong
    Wu, Celimuge
    Du, Zhaoyang
    Zhong, Lei
    Yoshinaga, Tsutomu
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 5739 - 5744
  • [18] Communication-Efficient Federated Learning via Regularized Sparse Random Networks
    Mestoukirdi, Mohamad
    Esrafilian, Omid
    Gesbert, David
    Li, Qianrui
    Gresset, Nicolas
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (07) : 1574 - 1578
  • [19] Communication-Efficient Personalized Federated Meta-Learning in Edge Networks
    Yu, Feng
    Lin, Hui
    Wang, Xiaoding
    Garg, Sahil
    Kaddoum, Georges
    Singh, Satinder
    Hassan, Mohammad Mehedi
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2023, 20 (02): 1558 - 1571
  • [20] Communication-Efficient Federated Learning for Large-Scale Multiagent Systems in ISAC: Data Augmentation With Reinforcement Learning
    Ouyang, Wenjiang
    Liu, Qian
    Mu, Junsheng
    Al-Dulaimi, Anwer
    Jing, Xiaojun
    Liu, Qilie
    IEEE SYSTEMS JOURNAL, 2024, : 1893 - 1904