Communication-Efficient Personalized Federated Meta-Learning in Edge Networks

Cited: 10
|
Authors
Yu, Feng [1 ,2 ]
Lin, Hui [1 ,2 ]
Wang, Xiaoding [1 ,2 ]
Garg, Sahil [3 ]
Kaddoum, Georges [3 ,4 ]
Singh, Satinder [5 ]
Hassan, Mohammad Mehedi [6 ]
Affiliations
[1] Fujian Normal Univ, Coll Comp & Cyber Secur, Fuzhou 350117, Peoples R China
[2] Fujian Prov Univ, Engn Res Ctr Cyber Secur & Educ Informatizat, Fuzhou 350117, Fujian, Peoples R China
[3] Ecole Technol Super, Montreal, PQ H3C 1K3, Canada
[4] Lebanese Amer Univ, Cyber Secur Syst & Appl AI Res Ctr, Beirut, Lebanon
[5] Ultra Commun, Montreal, PQ H4T 1V7, Canada
[6] King Saud Univ, Coll Comp & Informat Sci, Dept Informat Syst, Riyadh 11543, Saudi Arabia
Keywords
Edge networks; federated meta learning; representation learning; autoencoder; differential privacy;
DOI
10.1109/TNSM.2023.3263831
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Classification Code
0812;
Abstract
Due to the privacy-breach risks and data-aggregation burden of traditional centralized machine learning (ML) approaches, applications, data, and computing power are being pushed from centralized data centers to edge network nodes. Federated Learning (FL) is an emerging privacy-preserving distributed ML paradigm suited to edge network applications that addresses both of these issues. However, current FL methods cannot flexibly handle the challenges of model personalization and communication overhead in network applications. Inspired by the mixture of global and local models, we propose a Communication-Efficient Personalized Federated Meta-Learning algorithm that obtains a novel personalized model by introducing a personalization parameter; adjusting the size of this parameter improves model accuracy and accelerates convergence. Furthermore, the local model to be uploaded is transformed into a latent space through an autoencoder, reducing the amount of communication data and hence the communication overhead. Local and task-global differential privacy are applied to protect privacy during model generation. Simulation experiments demonstrate that, compared with several other algorithms, our method obtains better personalized models at lower communication overhead for edge network applications.
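The abstract combines three mechanisms: mixing global and local models via a personalization parameter, compressing the upload through an autoencoder's latent space, and adding differential-privacy noise before transmission. The following is a minimal sketch of that pipeline under strong simplifying assumptions: a flat parameter vector, a random linear encoder, and illustrative noise scales. All names, dimensions, and constants here are hypothetical and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model weights (dimensions are illustrative, not from the paper).
w_global = rng.normal(size=8)   # server-aggregated global model
w_local = rng.normal(size=8)    # client's locally adapted model

def personalize(w_global, w_local, alpha):
    """Mix global and local models. alpha is the personalization
    parameter: alpha=0 yields the purely global model, alpha=1 the
    purely local one."""
    return alpha * w_local + (1.0 - alpha) * w_global

w_personal = personalize(w_global, w_local, alpha=0.3)

# Stand-in "autoencoder" encoder: a fixed linear projection into a
# lower-dimensional latent space, so the uploaded payload is smaller
# than the raw model (8 parameters -> 4 latent values).
encoder = rng.normal(size=(4, 8)) / np.sqrt(8)
latent = encoder @ w_personal

# Local differential privacy: perturb the latent code with Gaussian
# noise before it leaves the device (noise scale is illustrative).
noisy_latent = latent + rng.normal(scale=0.1, size=latent.shape)
```

In the paper's actual scheme the encoder is learned and the server decodes the latent code back to model space; this sketch only shows why the upload shrinks (4 values instead of 8) and where the privacy noise is injected.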
Pages: 1558-1571
Number of pages: 14
Related Papers
50 records
  • [31] Communication-Efficient Federated Learning via Regularized Sparse Random Networks
    Mestoukirdi, Mohamad
    Esrafilian, Omid
    Gesbert, David
    Li, Qianrui
    Gresset, Nicolas
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (07) : 1574 - 1578
  • [32] Communication-Efficient and Attack-Resistant Federated Edge Learning With Dataset Distillation
    Zhou, Yanlin
    Ma, Xiyao
    Wu, Dapeng
    Li, Xiaolin
    IEEE TRANSACTIONS ON CLOUD COMPUTING, 2023, 11 (03) : 2517 - 2528
  • [33] FlocOff: Data Heterogeneity Resilient Federated Learning With Communication-Efficient Edge Offloading
    Ma, Mulei
    Gong, Chenyu
    Zeng, Liekang
    Yang, Yang
    Wu, Liantao
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2024, 42 (11) : 3262 - 3277
  • [34] Communication-efficient asynchronous federated learning in resource-constrained edge computing
    Liu, Jianchun
    Xu, Hongli
    Xu, Yang
    Ma, Zhenguo
    Wang, Zhiyuan
    Qian, Chen
    Huang, He
    COMPUTER NETWORKS, 2021, 199
  • [35] Communication-Efficient Federated Edge Learning via Optimal Probabilistic Device Scheduling
    Zhang, Maojun
    Zhu, Guangxu
    Wang, Shuai
    Jiang, Jiamo
    Liao, Qing
    Zhong, Caijun
    Cui, Shuguang
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (10) : 8536 - 8551
  • [36] SHARE: Shaping Data Distribution at Edge for Communication-Efficient Hierarchical Federated Learning
    Deng, Yongheng
    Lyu, Feng
    Ren, Ju
    Zhang, Yongmin
    Zhou, Yuezhi
    Zhang, Yaoxue
    Yang, Yuanyuan
    2021 IEEE 41ST INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS 2021), 2021, : 24 - 34
  • [37] HPFL-CN: Communication-Efficient Hierarchical Personalized Federated Edge Learning via Complex Network Feature Clustering
    Li, Zijian
    Chen, Zihan
    Wei, Xiaohui
    Gao, Shang
    Ren, Chenghao
    Quek, Tony Q. S.
    2022 19TH ANNUAL IEEE INTERNATIONAL CONFERENCE ON SENSING, COMMUNICATION, AND NETWORKING (SECON), 2022, : 325 - 333
  • [38] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [39] Communication-Efficient Federated Learning for Decision Trees
    Zhao, Shuo
    Zhu, Zikun
    Li, Xin
    Chen, Ying-Chi
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11) : 5478 - 5492
  • [40] Joint Model Pruning and Device Selection for Communication-Efficient Federated Edge Learning
    Liu, Shengli
    Yu, Guanding
    Yin, Rui
    Yuan, Jiantao
    Shen, Lei
    Liu, Chonghe
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2022, 70 (01) : 231 - 244