Federated Feature Concatenate Method for Heterogeneous Computing in Federated Learning

Cited by: 2
Authors
Chung, Wu-Chun [1]
Chang, Yung-Chin [1]
Hsu, Ching-Hsien [2,3]
Chang, Chih-Hung [4]
Hung, Che-Lun [4,5]
Affiliations
[1] Chung Yuan Christian Univ, Dept Informat & Comp Engn, Taoyuan, Taiwan
[2] Asia Univ, Dept Comp Sci & Informat Engn, Taichung, Taiwan
[3] China Med Univ, China Med Univ Hosp, Dept Med Res, Taichung, Taiwan
[4] Providence Univ, Dept Comp Sci & Commun Engn, Taichung, Taiwan
[5] Natl Yang Ming Chiao Tung Univ, Inst Biomed Informat, Taipei, Taiwan
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2023, Vol. 75, No. 1
Keywords
Federated learning; deep learning; artificial intelligence; heterogeneous computing; COMMUNICATION;
DOI
10.32604/cmc.2023.035720
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
Federated learning is an emerging machine learning technique that enables clients to collaboratively train a deep learning model without uploading raw data to the aggregation server. Each client may be equipped with different computing resources for model training. A client with lower computing capability requires more time for model training, prolonging the overall training time in federated learning; it may even fail to train the entire model because of out-of-memory issues. This study tackles these problems by proposing the federated feature concatenate (FedFC) method for federated learning with heterogeneous clients. FedFC leverages model splitting and feature concatenation to offload a portion of the training load from clients to the aggregation server. Each client in FedFC can collaboratively train the model with a different cutting layer. As a result, the specific features learned in the deeper layers of the server-side model become more consistent for classifying the data classes. Accordingly, FedFC reduces the computational load on resource-constrained clients and accelerates convergence. The performance is verified under different dataset scenarios, such as data and class imbalance among the participating clients, and the impact of different cutting layers is evaluated during model training. The experimental results show that the co-adapted features have a critical impact on adequate classification by the deep learning model. Overall, FedFC not only shortens the convergence time but also improves the best accuracy by up to 5.9% and 14.5% compared to conventional federated learning and SplitFed, respectively. In conclusion, the proposed approach is feasible and effective for heterogeneous clients in federated learning.
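The split-and-concatenate idea in the abstract can be sketched as follows. This is a minimal illustrative forward pass only, not the authors' implementation: the layer sizes, the two cut points, and the function names are assumptions, and the training loop, gradients, and communication are omitted. A weak client stops at an earlier cutting layer and offloads the remaining layers to the server; the server completes each client's forward pass and concatenates the resulting features so the shared deeper layer sees every client's batch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical three-layer dense model. Layer 1 always runs on the
# client; layer 2 runs on the client only if its cutting layer is
# deep enough; layer 3 always runs on the server.
W1 = 0.1 * rng.normal(size=(8, 16))
W2 = 0.1 * rng.normal(size=(16, 16))
W3 = 0.1 * rng.normal(size=(16, 4))

def client_forward(x, cut_layer):
    """Run the client-side portion up to its cutting layer and
    return the intermediate ("smashed") features."""
    h = relu(x @ W1)
    if cut_layer >= 2:
        h = relu(h @ W2)
    return h, cut_layer

def server_forward(features, cut_layer):
    """Apply any layers the client skipped, then the shared
    server-side layer, producing class logits."""
    h = features
    if cut_layer < 2:          # weak client offloaded layer 2
        h = relu(h @ W2)
    return h @ W3

# Two heterogeneous clients: a weak one cut after layer 1 and a
# strong one cut after layer 2, each with a local batch of 5 samples.
x_weak = rng.normal(size=(5, 8))
x_strong = rng.normal(size=(5, 8))

f_weak, c_weak = client_forward(x_weak, cut_layer=1)
f_strong, c_strong = client_forward(x_strong, cut_layer=2)

# The server finishes each forward pass and concatenates the feature
# batches, so the deeper layers co-adapt across all clients' data.
logits = np.concatenate([server_forward(f_weak, c_weak),
                         server_forward(f_strong, c_strong)], axis=0)
print(logits.shape)  # (10, 4)
```

Despite the different cutting layers, both batches reach the same server-side classifier, which is what lets the deeper layers learn features that are consistent across heterogeneous clients.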
Pages: 351-371
Page count: 21
Related Papers
50 records
  • [31] Semi-Federated Learning for Connected Intelligence With Computing-Heterogeneous Devices
    Han, Jiachen; Ni, Wanli; Li, Li
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (21): 34078-34092
  • [32] Hier-FUN: Hierarchical Federated Learning and Unlearning in Heterogeneous Edge Computing
    Ma, Zhenguo; Tu, Huaqing; Zhou, Li; Ji, Pengli; Yan, Xiaoran; Xu, Hongli; Wang, Zhiyuan; Chen, Suo
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): 8653-8668
  • [33] FedFA: Federated Learning With Feature Anchors to Align Features and Classifiers for Heterogeneous Data
    Zhou, Tailin; Zhang, Jun; Tsang, Danny H. K.
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (06): 6731-6742
  • [34] FedMP: Federated Learning through Adaptive Model Pruning in Heterogeneous Edge Computing
    Jiang, Zhida; Xu, Yang; Xu, Hongli; Wang, Zhiyuan; Qiao, Chunming; Zhao, Yangming
    2022 IEEE 38TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2022), 2022: 767-779
  • [35] CoopFL: Accelerating federated learning with DNN partitioning and offloading in heterogeneous edge computing
    Wang, Zhiyuan; Xu, Hongli; Xu, Yang; Jiang, Zhida; Liu, Jianchun
    COMPUTER NETWORKS, 2023, 220
  • [36] FedSA: A Semi-Asynchronous Federated Learning Mechanism in Heterogeneous Edge Computing
    Ma, Qianpiao; Xu, Yang; Xu, Hongli; Jiang, Zhida; Huang, Liusheng; Huang, He
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12): 3654-3672
  • [37] CFL-HC: A Coded Federated Learning Framework for Heterogeneous Computing Scenarios
    Wang, Dong; Wang, Baoqian; Zhang, Jinran; Lu, Kejie; Xie, Junfei; Wan, Yan; Fu, Shengli
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021
  • [38] PrVFL: Pruning-Aware Verifiable Federated Learning for Heterogeneous Edge Computing
    Wang, Xigui; Yu, Haiyang; Chen, Yuwen; Sinnott, Richard O.; Yang, Zhen
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (12): 15062-15079
  • [39] MultimodalHD: Federated Learning Over Heterogeneous Sensor Modalities using Hyperdimensional Computing
    Zhao, Quanling; Yu, Xiaofan; Hu, Shengfan; Rosing, Tajana
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024
  • [40] Edge computing privacy protection method based on blockchain and federated learning
    Fang, C.; Guo, Y.; Wang, Y.; Hu, Y.; Ma, J.; Zhang, H.; Hu, Y.
    Tongxin Xuebao/Journal on Communications, 2021, 42 (11): 28-40