Small models, big impact: A review on the power of lightweight Federated Learning

Cited by: 4
Authors
Qi, Pian [1 ]
Chiaro, Diletta [1 ]
Piccialli, Francesco [1 ]
Affiliations
[1] Univ Naples Federico II, Dept Math & Applicat R Caccioppoli, Naples, Italy
Keywords
Federated Learning; Device heterogeneity; Constrained devices; Lightweight federated learning; Tiny federated learning; Data availability; INTERNET; FUTURE;
DOI
10.1016/j.future.2024.107484
Chinese Library Classification
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
Federated Learning (FL) enhances Artificial Intelligence (AI) applications by enabling individual devices to collaboratively learn shared models without sharing local data with third parties, thereby preserving privacy. However, implementing FL in real-world scenarios presents numerous challenges, especially on IoT devices with limited memory, diverse communication conditions, and varying computational capabilities. The research community is therefore turning to lightweight FL: a class of solutions that optimize FL training, inference, and deployment to run efficiently on IoT devices. This paper reviews lightweight FL, systematically organizing and summarizing the related techniques according to the FL workflow. Finally, we identify open problems in this area and suggest future directions to provide valuable insights into the field.
Pages: 15
Related Papers (50 total)
  • [31] Advancements in Federated Learning: Models, Methods, and
    Chen, Huiming
    Wang, Huandong
    Long, Qingyue
    Jin, Depeng
    Li, Yong
    ACM COMPUTING SURVEYS, 2025, 57 (02)
  • [32] Pretrained Models for Multilingual Federated Learning
    Weller, Orion
    Marone, Marc
    Braverman, Vladimir
    Lawrie, Dawn
    Van Durme, Benjamin
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 1413 - 1421
  • [33] Distributed and deep vertical federated learning with big data
    Liu, Ji
    Zhou, Xuehai
    Mo, Lei
    Ji, Shilei
    Liao, Yuan
    Li, Zheng
    Gu, Qin
    Dou, Dejing
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (21)
  • [34] FLight: A lightweight federated learning framework in edge and fog computing
    Zhu, Wuji
    Goudarzi, Mohammad
    Buyya, Rajkumar
    SOFTWARE-PRACTICE & EXPERIENCE, 2024, 54 (05): : 813 - 841
  • [35] LSFL: A Lightweight and Secure Federated Learning Scheme for Edge Computing
    Zhang, Zhuangzhuang
    Wu, Libing
    Ma, Chuanguo
    Li, Jianxin
    Wang, Jing
    Wang, Qian
    Yu, Shui
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2023, 18 : 365 - 379
  • [36] Transmit Power Control for Indoor Small Cells: A Method Based on Federated Reinforcement Learning
    Li, Peizheng
    Erdol, Hakan
    Briggs, Keith
    Wang, Xiaoyang
    Piechocki, Robert
    Ahmad, Abdelrahim
    Inacio, Rui
    Kapoor, Shipra
    Doufexi, Angela
    Parekh, Arjun
    2022 IEEE 96TH VEHICULAR TECHNOLOGY CONFERENCE (VTC2022-FALL), 2022
  • [37] Small Models for Big Data
    Mistry, Hitesh B.
    Orrell, David
    CLINICAL PHARMACOLOGY & THERAPEUTICS, 2020, 107 (04) : 710 - 711
  • [38] The Impact of Federated Learning on Urban Computing
    Souza, Jose R. F.
    Oliveira, Sheridan Z. L. N.
    Oliveira, Helder
    JOURNAL OF INTERNET SERVICES AND APPLICATIONS, 2024, 15 (01) : 380 - 409
  • [39] Big power in a small package
    Motion, Danaher
    Motion System Design, 2002, 44 (03): : 19 - 22
  • [40] SMALL PARTICLES ARE BIG ON POWER
    [Anonymous]
    NEW SCIENTIST, 1983, 97 (1341) : 160 - 160