Federated Feature Concatenate Method for Heterogeneous Computing in Federated Learning

Cited by: 2
Authors
Chung, Wu-Chun [1]
Chang, Yung-Chin [1]
Hsu, Ching-Hsien [2,3]
Chang, Chih-Hung [4]
Hung, Che-Lun [4,5]
Affiliations
[1] Chung Yuan Christian Univ, Dept Informat & Comp Engn, Taoyuan, Taiwan
[2] Asia Univ, Dept Comp Sci & Informat Engn, Taichung, Taiwan
[3] China Med Univ, China Med Univ Hosp, Dept Med Res, Taichung, Taiwan
[4] Providence Univ, Dept Comp Sci & Commun Engn, Taichung, Taiwan
[5] Natl Yang Ming Chiao Tung Univ, Inst Biomed Informat, Taipei, Taiwan
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2023, Vol. 75, No. 1
Keywords
Federated learning; deep learning; artificial intelligence; heterogeneous computing; communication
DOI
10.32604/cmc.2023.035720
CLC number
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
Federated learning is an emerging machine learning technique that enables clients to collaboratively train a deep learning model without uploading raw data to the aggregation server. Each client may be equipped with different computing resources for model training. A client with lower computing capability requires more time for model training, which prolongs the overall training time in federated learning; it may even fail to train the entire model because of out-of-memory issues. This study tackles these problems and proposes the federated feature concatenate (FedFC) method for federated learning with heterogeneous clients. FedFC leverages model splitting and feature concatenation to offload a portion of the training load from the clients to the aggregation server. Each client in FedFC can collaboratively train a model with a different cutting layer, so the features learned in the deeper layers of the server-side model become more consistent for classifying the data classes. Accordingly, FedFC reduces the computation load of resource-constrained clients and accelerates convergence. The performance is verified under different dataset scenarios, such as data and class imbalance among the participating clients, and the impact of different cutting layers is evaluated during model training. The experimental results show that the co-adapted features have a critical impact on the classification quality of the deep learning model. Overall, FedFC not only shortens the convergence time but also improves the best accuracy by up to 5.9% and 14.5% compared with conventional federated learning and SplitFed, respectively. In conclusion, the proposed approach is feasible and effective for heterogeneous clients in federated learning.
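A minimal sketch of the split-training and feature-concatenation idea outlined in the abstract is given below. It assumes PyTorch; the layer sizes, the per-client cut layers, the per-client "bridge" layers run on the server, and the shared head are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

# Full model as a list of layers: each client trains the layers before its
# cut layer, while the server trains the remaining, deeper layers.
full_layers = [nn.Linear(32, 64), nn.ReLU(),
               nn.Linear(64, 64), nn.ReLU(),
               nn.Linear(64, 10)]

def split_model(cut):
    # Split the layer list at a client-specific cut layer.
    return nn.Sequential(*full_layers[:cut]), nn.Sequential(*full_layers[cut:])

# Hypothetical heterogeneous clients choosing cut layers by capability:
# the weaker client cuts earlier and offloads more layers to the server.
client_cuts = {"client_a": 2, "client_b": 4}

x = torch.randn(8, 32)                         # toy local batch held by each client
smashed_features = []
for name, cut in client_cuts.items():
    client_part, _ = split_model(cut)
    smashed_features.append(client_part(x))    # only this forward pass runs on the client

# Server side: bring each client's features up to a common depth, then
# concatenate them so the shared deep layers learn from all clients' features.
head_start = 4                                 # assumed index where the shared deep layers begin
shared_head = nn.Sequential(*full_layers[head_start:])
aligned = []
for (name, cut), feat in zip(client_cuts.items(), smashed_features):
    bridge = nn.Sequential(*full_layers[cut:head_start])  # per-client layers between cut and head
    aligned.append(bridge(feat))

concat = torch.cat(aligned, dim=0)             # feature concatenation across clients
logits = shared_head(concat)                   # deeper layers see the concatenated features
print(logits.shape)                            # torch.Size([16, 10])

In the actual method the server would also back-propagate through its layers and aggregate the client-side models each round; those steps are omitted here for brevity.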
Pages: 351-371
Number of pages: 21