Semi-Federated Learning: Convergence Analysis and Optimization of a Hybrid Learning Framework

Cited by: 9
Authors
Zheng, Jingheng [1 ]
Ni, Wanli [1 ]
Tian, Hui [1 ]
Gunduz, Deniz [2 ]
Quek, Tony Q. S. [3 ,4 ]
Han, Zhu [5 ,6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[2] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
[3] Singapore Univ Technol & Design, Pillar Informat Syst Technol & Design, Singapore 487372, Singapore
[4] Kyung Hee Univ, Dept Elect Engn, Yongin 17104, South Korea
[5] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77004 USA
[6] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 446701, South Korea
Keywords
Convergence; Computational modeling; Transceivers; Training; NOMA; Data models; Privacy; Semi-federated learning; communication efficiency; convergence analysis; transceiver design; RESOURCE-ALLOCATION; COMMUNICATION-EFFICIENT; MIMO-NOMA; COMPUTATION; MINIMIZATION; DESIGN;
DOI
10.1109/TWC.2023.3270908
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics & Communication Technology);
Discipline Classification Codes
0808; 0809;
Abstract
Under the organization of the base station (BS), wireless federated learning (FL) enables collaborative model training among multiple devices. However, the BS is merely responsible for aggregating local updates during the training process, which wastes the computational resources at the BS. To tackle this issue, we propose a semi-federated learning (SemiFL) paradigm to leverage the computing capabilities of both the BS and devices for a hybrid implementation of centralized learning (CL) and FL. Specifically, each device sends both local gradients and data samples to the BS for training a shared global model. To improve communication efficiency over the same time-frequency resources, we integrate over-the-air computation for aggregation and non-orthogonal multiple access for transmission by designing a novel transceiver structure. To gain deep insights, we conduct convergence analysis by deriving a closed-form optimality gap for SemiFL and extend the result to two extra cases. In the first case, the BS uses all accumulated data samples to calculate the CL gradient, while a decreasing learning rate is adopted in the second case. Our analytical results capture the destructive effect of wireless communication and show that both FL and CL are special cases of SemiFL. Then, we formulate a non-convex problem to reduce the optimality gap by jointly optimizing the transmit power and receive beamformers. Accordingly, we propose a two-stage algorithm to solve this intractable problem, in which we provide closed-form solutions to the beamformers. Extensive simulation results on two real-world datasets corroborate our theoretical analysis, and show that the proposed SemiFL outperforms conventional FL and achieves a 3.2% accuracy gain on the MNIST dataset compared to state-of-the-art benchmarks.
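The abstract's core idea, combining device-side FL gradients (aggregated over the air) with a centralized gradient the BS computes on uploaded samples, can be illustrated with a toy sketch. This is not the paper's algorithm: the least-squares model, the equal-weight combination of the two gradients, and the Gaussian noise term standing in for analog-aggregation distortion are all simplifying assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad(w, X, y):
    """Gradient of the average squared loss 0.5*||Xw - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def semifl_round(w, devices, uploaded, lr=0.1, noise_std=0.0):
    """One hybrid SemiFL-style round (illustrative sketch).

    devices:  list of (X, y) local datasets; their gradients are summed
              'over the air' (modeled as a sum plus receiver noise).
    uploaded: subset of raw samples the devices also sent to the BS,
              on which the BS computes a centralized (CL) gradient.
    """
    # Over-the-air aggregation: the channel adds the analog gradient
    # signals; additive noise stands in for aggregation distortion.
    air_sum = sum(grad(w, X, y) for X, y in devices)
    fl_grad = (air_sum + noise_std * rng.standard_normal(w.shape)) / len(devices)
    # Centralized gradient on the pooled uploaded samples.
    Xc = np.vstack([X for X, _ in uploaded])
    yc = np.concatenate([y for _, y in uploaded])
    cl_grad = grad(w, Xc, yc)
    # Hybrid update: equal-weight mix of the FL and CL directions
    # (an assumption; the paper designs this combination carefully).
    return w - lr * 0.5 * (fl_grad + cl_grad)
```

With `noise_std=0` this reduces to plain gradient descent on the union of the local objectives; raising `noise_std` mimics the "destructive effect of wireless communication" on the convergence gap.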
Pages: 9438-9456
Page count: 19
Related Papers
50 records
  • [21] PHFL: a federated learning framework based on a hybrid mechanism
    Chen, Xingyu
    Chen, Yuxiang
    Liang, Wei
    He, Dacheng
    Li, Kuanching
    Ivanovic, Mirjana
    JOURNAL OF SUPERCOMPUTING, 2025, 81 (05):
  • [22] Convergence Time Optimization for Federated Learning Over Wireless Networks
    Chen, Mingzhe
    Poor, H. Vincent
    Saad, Walid
    Cui, Shuguang
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2021, 20 (04) : 2457 - 2471
  • [23] CSAFL: A Clustered Semi-Asynchronous Federated Learning Framework
    Zhang, Yu
    Duan, Moming
    Liu, Duo
    Li, Li
    Ren, Ao
    Chen, Xianzhang
    Tan, Yujuan
    Wang, Chengliang
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [24] Convergence Analysis and Optimization of SWIPT-Based Over-the-Air Federated Learning
    Fan, Shaoshuai
    Tao, Shilin
    Ni, Wanli
    Tian, Hui
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (06) : 1352 - 1356
  • [25] Vertical Federated Learning Over Cloud-RAN: Convergence Analysis and System Optimization
    Shi, Yuanming
    Xia, Shuhao
    Zhou, Yong
    Mao, Yijie
    Jiang, Chunxiao
    Tao, Meixia
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (02) : 1327 - 1342
  • [26] Convergence Analysis for Wireless Federated Learning with Gradient Recycling
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    2023 INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING, IWCMC, 2023, : 1232 - 1237
  • [27] A Hybrid Semi-Asynchronous Federated Learning and Split Learning Strategy in Edge Networks
    Singh, Neha
    Adhikari, Mainak
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2025, 12 (02): : 1429 - 1439
  • [28] Convergence Analysis of Sequential Federated Learning on Heterogeneous Data
    Li, Yipeng
    Lyu, Xinchen
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [29] A Decentralized Federated Learning Framework via Committee Mechanism With Convergence Guarantee
    Che, Chunjiang
    Li, Xiaoli
    Chen, Chuan
    He, Xiaoyu
    Zheng, Zibin
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (12) : 4783 - 4800
  • [30] Mobility Accelerates Learning: Convergence Analysis on Hierarchical Federated Learning in Vehicular Networks
    Chen, Tan
    Yan, Jintao
    Sun, Yuxuan
    Zhou, Sheng
    Gunduz, Deniz
    Niu, Zhisheng
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2025, 74 (01) : 1657 - 1673