Tensor-Enabled Communication-Efficient and Trustworthy Federated Learning for Heterogeneous Intelligent Space-Air-Ground-Integrated IoT

Cited by: 4
Authors
Zhao, Ruonan [1 ]
Yang, Laurence T. [1 ,2 ,3 ]
Liu, Debin [4 ]
Lu, Wanli [1 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Hubei Engn Res Ctr Big Data Secur, Sch Cyber Sci & Engn, Hubei Key Lab Distributed Syst Secur, Wuhan 430074, Peoples R China
[2] Hainan Univ, Sch Comp Sci & Technol, Haikou 570228, Peoples R China
[3] St Francis Xavier Univ, Dept Comp Sci, Antigonish, NS B2G 2W5, Canada
[4] Huazhong Univ Sci & Technol, Sch Comp Sci & Technol, Wuhan 430074, Peoples R China
Keywords
Computational modeling; Tensors; Data models; Training; Security; Internet of Things; Adaptation models; Adaptivity; communication efficiency; federated learning (FL); heterogeneous clients; model security
DOI
10.1109/JIOT.2023.3283853
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated learning (FL) offers a promising privacy-preserving intelligent learning paradigm for the space-air-ground-integrated Internet of Things (SAGI-IoT) by breaking down data islands and resolving the tension between data privacy and data sharing. Adaptivity, communication efficiency, and model security are the three main challenges currently facing FL, yet existing works rarely consider them simultaneously. Concretely, most existing FL works assume that local models share the same architecture as the global model, which limits adaptivity and cannot meet the heterogeneous requirements of SAGI-IoT. Exchanging numerous model parameters not only generates massive communication overhead but also risks privacy leakage. Moreover, the security of FL based on homomorphic encryption with a single private key is weak. To address these issues, this article proposes a tensor-empowered communication-efficient and trustworthy heterogeneous FL scheme, in which participants choose heterogeneous local models suited to their actual computing and communication environments, so that clients with different capabilities can each do what they are good at. Additionally, tensor-train decomposition is leveraged to reduce the number of communicated parameters while maintaining model performance, further lowering the storage requirements and communication overhead of heterogeneous clients. Finally, homomorphic encryption with the double-trapdoor property is utilized to provide a robust and trustworthy environment that can defend against inference attacks from malicious external attackers, an honest-but-curious server, and internal participating clients. Extensive experimental results show that the proposed approach is more adaptive and improves communication efficiency while protecting model security compared with the state of the art.
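The tensor-train (TT) compression idea mentioned in the abstract can be sketched in a few lines of NumPy. The sketch below is illustrative only, not the paper's implementation: the function names (`tt_decompose`, `tt_reconstruct`), the sequential-truncated-SVD construction (the standard TT-SVD scheme), and the rank-1 example tensor are all assumptions made for the example.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Decompose an N-D array into tensor-train (TT) cores via sequential
    truncated SVDs.  Core k has shape (r_k, n_k, r_{k+1})."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        mat = mat.reshape(rank * shape[k], -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_next = min(max_rank, len(S))           # truncate to the TT rank
        cores.append(U[:, :r_next].reshape(rank, shape[k], r_next))
        mat = np.diag(S[:r_next]) @ Vt[:r_next]  # carry the remainder forward
        rank = r_next
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# A client would transmit the small cores instead of the full weight tensor.
rng = np.random.default_rng(0)
weights = np.einsum('i,j,k->ijk', *[rng.normal(size=n) for n in (8, 9, 10)])
cores = tt_decompose(weights, max_rank=1)
sent = sum(core.size for core in cores)   # 1*8*1 + 1*9*1 + 1*10*1 = 27 values
full = weights.size                       # 8 * 9 * 10 = 720 values
error = np.linalg.norm(tt_reconstruct(cores) - weights)
```

Because the example tensor is exactly rank 1, `max_rank=1` reconstructs it with negligible error while shrinking the payload from 720 to 27 values; real model tensors would need larger TT ranks and incur a controlled approximation error.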
Pages: 20285 - 20296
Page count: 12
Related Papers
50 records in total
  • [21] Massive Digital Over-the-Air Computation for Communication-Efficient Federated Edge Learning
    Qiao, Li
    Gao, Zhen
    Mashhadi, Mahdi Boloursaz
    Gunduz, Deniz
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2024, 42 (11) : 3078 - 3094
  • [22] Communication-Efficient Device Scheduling via Over-the-Air Computation for Federated Learning
    Jiang, Bingqing
    Du, Jun
    Jiang, Chunxiao
    Shi, Yuanming
    Han, Zhu
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 173 - 178
  • [23] Space-Air-Ground-Sea Integrated Network with Federated Learning
    Zhao, Hao
    Ji, Fei
    Wang, Yan
    Yao, Kexing
    Chen, Fangjiong
    REMOTE SENSING, 2024, 16 (09)
  • [24] Communication-Efficient and Model-Heterogeneous Personalized Federated Learning via Clustered Knowledge Transfer
    Cho, Yae Jee
    Wang, Jianyu
    Chirvolu, Tarun
    Joshi, Gauri
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2023, 17 (01) : 234 - 247
  • [25] Joint Knowledge Distillation and Local Differential Privacy for Communication-Efficient Federated Learning in Heterogeneous Systems
    Gad, Gad
    Fadlullah, Zubair Md
    Fouda, Mostafa M.
    Ibrahem, Mohamed I.
    Nasser, Nidal
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 2051 - 2056
  • [26] Communication-Efficient Federated Learning With Adaptive Aggregation for Heterogeneous Client-Edge-Cloud Network
    Luo, Long
    Zhang, Chi
    Yu, Hongfang
    Sun, Gang
    Luo, Shouxi
    Dustdar, Schahram
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (06) : 3241 - 3255
  • [27] AGQFL: Communication-efficient Federated Learning via Automatic Gradient Quantization in Edge Heterogeneous Systems
    Lian, Zirui
    Cao, Jing
    Zuo, Yanru
    Liu, Weihong
    Zhu, Zongwei
    2021 IEEE 39TH INTERNATIONAL CONFERENCE ON COMPUTER DESIGN (ICCD 2021), 2021, : 551 - 558
  • [28] Federated Learning for Intelligent Transmission with Space-Air-Ground Integrated Network toward 6G
    Tang, Fengxiao
    Wen, Cong
    Chen, Xuehan
    Kato, Nei
    IEEE NETWORK, 2023, 37 (02): : 198 - 204
  • [29] Communication-Efficient Federated Learning in UAV-enabled IoV: A Joint Auction-Coalition Approach
    Ng, Jer Shyuan
    Lim, Wei Yang Bryan
    Dai, Hong-Ning
    Xiong, Zehui
    Huang, Jianqiang
    Niyato, Dusit
    Hua, Xian-Sheng
    Leung, Cyril
    Miao, Chunyan
    2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2020,
  • [30] HCFL: A High Compression Approach for Communication-Efficient Federated Learning in Very Large Scale IoT Networks
    Nguyen, Minh-Duong
    Lee, Sang-Min
    Pham, Quoc-Viet
    Hoang, Dinh Thai
    Nguyen, Diep N.
    Hwang, Won-Joo
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (11) : 6495 - 6507