Communication-efficient and Scalable Decentralized Federated Edge Learning

Cited by: 0
Authors
Yapp, Austine Zong Han [1 ]
Koh, Hong Soo Nicholas [1 ]
Lai, Yan Ting [1 ]
Kang, Jiawen [1 ]
Li, Xuandi [1 ]
Ng, Jer Shyuan [2 ]
Jiang, Hongchao [2 ]
Lim, Wei Yang Bryan [2 ]
Xiong, Zehui [3 ]
Niyato, Dusit [1 ]
Affiliations
[1] Nanyang Technol Univ NTU, Sch Comp Sci & Engn, Singapore, Singapore
[2] Nanyang Technol Univ, Alibaba NTU Singapore Joint Res Inst JRI, Singapore, Singapore
[3] Singapore Univ Technol & Design SUTD, Singapore, Singapore
Funding
National Research Foundation, Singapore;
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Federated Edge Learning (FEL) is a distributed Machine Learning (ML) framework for collaborative model training on edge devices. FEL improves data privacy over traditional centralized ML training by keeping data on the devices and sending only local model updates to a central coordinator for aggregation. However, existing FEL architectures still suffer from high communication overhead between the edge devices and the coordinator. In this paper, we present a working prototype of a blockchain-empowered and communication-efficient FEL framework that enhances security and scalability toward large-scale implementation of FEL.
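As a point of reference for the aggregation step the abstract describes, the sketch below shows a minimal FedAvg-style round in Python: each edge device trains on its own data and only its model update is communicated for aggregation. This is an illustrative assumption, not the authors' blockchain-empowered protocol; the function and variable names (local_update, fed_avg, etc.) are hypothetical.

```python
# Minimal sketch of one Federated Edge Learning round (FedAvg-style aggregation).
# Illustrative assumption only; not the paper's blockchain-empowered scheme.
import numpy as np

def local_update(global_weights, local_data, lr=0.01, epochs=1):
    """Edge device refines the global model on its own data and
    returns only the weight update (never the raw data)."""
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of a simple least-squares loss
        w -= lr * grad
    return w - global_weights               # communicate the update, not the data

def fed_avg(global_weights, updates, num_samples):
    """Coordinator aggregates updates weighted by each device's data size."""
    total = sum(num_samples)
    weighted = sum(n / total * u for u, n in zip(updates, num_samples))
    return global_weights + weighted

# Toy usage: three devices, a 5-dimensional linear model, ten communication rounds.
rng = np.random.default_rng(0)
global_w = np.zeros(5)
devices = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
for _ in range(10):
    updates = [local_update(global_w, d) for d in devices]
    global_w = fed_avg(global_w, updates, [len(d[1]) for d in devices])
```

In a decentralized, blockchain-empowered variant such as the one the abstract targets, the coordinator's role in fed_avg would be replaced or audited by a distributed ledger, which is what allows the framework to scale without a single trusted aggregator.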
Pages: 5032-5035
Number of pages: 4
Related papers (50 in total)
  • [21] Communication-Efficient Federated Learning for Digital Twin Edge Networks in Industrial IoT. Lu, Yunlong; Huang, Xiaohong; Zhang, Ke; Maharjan, Sabita; Zhang, Yan. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2021, 17 (08): 5709-5718.
  • [22] DisPFL: Towards Communication-Efficient Personalized Federated Learning via Decentralized Sparse Training. Dai, Rong; Shen, Li; He, Fengxiang; Tian, Xinmei; Tao, Dacheng. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022.
  • [23] Communication-Efficient Federated Learning with Heterogeneous Devices. Chen, Zhixiong; Yi, Wenqiang; Liu, Yuanwei; Nallanathan, Arumugam. ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023: 3602-3607.
  • [24] Communication-Efficient Federated Learning for Decision Trees. Zhao, Shuo; Zhu, Zikun; Li, Xin; Chen, Ying-Chi. IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11): 5478-5492.
  • [25] A Decentralized Communication-Efficient Federated Analytics Framework for Connected Vehicles. Zhao, Liang; Valero, Maria; Pouriyeh, Seyedamin; Li, Fangyu; Guo, Lulu; Han, Zhu. IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (07): 10856-10861.
  • [26] Communication-Efficient Federated Learning and Permissioned Blockchain for Digital Twin Edge Networks. Lu, Yunlong; Huang, Xiaohong; Zhang, Ke; Maharjan, Sabita; Zhang, Yan. IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (04): 2276-2288.
  • [27] Joint Model Pruning and Device Selection for Communication-Efficient Federated Edge Learning. Liu, Shengli; Yu, Guanding; Yin, Rui; Yuan, Jiantao; Shen, Lei; Liu, Chonghe. IEEE TRANSACTIONS ON COMMUNICATIONS, 2022, 70 (01): 231-244.
  • [28] Communication-Efficient Federated Learning with Adaptive Quantization. Mao, Yuzhu; Zhao, Zihao; Yan, Guangfeng; Liu, Yang; Lan, Tian; Song, Linqi; Ding, Wenbo. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04).
  • [29] FedBoost: Communication-Efficient Algorithms for Federated Learning. Hamer, Jenny; Mohri, Mehryar; Suresh, Ananda Theertha. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020.
  • [30] Communication-Efficient Secure Aggregation for Federated Learning. Ergun, Irem; Sami, Hasin Us; Guler, Basak. 2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022: 3881-3886.