Communication-Efficient and Privacy-Aware Distributed Learning

Cited by: 2
Authors
Gogineni, Vinay Chakravarthi [1 ,2 ]
Moradi, Ashkan [3 ]
Venkategowda, Naveen K. D. [4 ]
Werner, Stefan [3 ,5 ]
Affiliations
[1] Norwegian Univ Sci & Technol, N-7491 Trondheim, Norway
[2] Univ Southern Denmark, Maersk Mc Kinney Moller Inst, SDU Appl AI & Data Sci, DK-5230 Odense, Denmark
[3] Norwegian Univ Sci & Technol, Dept Elect Syst, N-7491 Trondheim, Norway
[4] Linkoping Univ, S-60174 Norrkoping, Sweden
[5] Aalto Univ, Dept Informat & Commun Engn, Espoo 00076, Finland
Keywords
Privacy; Distance learning; Computer aided instruction; Heuristic algorithms; Information processing; Differential privacy; Convergence; Average consensus; communication efficiency; distributed learning; multiagent systems; privacy-preservation; NETWORKS; SECURE;
DOI
10.1109/TSIPN.2023.3322783
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Code
0808; 0809;
Abstract
Communication efficiency and privacy are two key concerns in modern distributed computing systems. To address both, this article proposes partial-sharing private distributed learning (PPDL) algorithms that offer communication efficiency while preserving privacy, making them suitable for applications with limited resources in adversarial environments. First, we propose a noise-injection-based PPDL algorithm that achieves communication efficiency by sharing only a fraction of the information at each consensus iteration and provides privacy by perturbing the information exchanged among neighbors. To further increase privacy, local information is randomly decomposed into private and public substates before being shared with neighbors. This results in a decomposition- and noise-injection-based PPDL strategy in which only a fraction of the perturbed public substate is shared during local collaborations, whereas the private substate is updated locally without being shared. To determine the impact of communication savings and privacy preservation on the performance of distributed learning algorithms, we analyze the mean and mean-square convergence of the proposed algorithms. Moreover, we investigate the privacy of agents by characterizing privacy as the mean squared error of the estimate of private information at the honest-but-curious adversary. The analytical results show a tradeoff between communication efficiency and privacy in the proposed PPDL algorithms, while the decomposition- and noise-injection-based PPDL improves privacy compared to the noise-injection-based PPDL. Lastly, numerical simulations corroborate the analytical findings.
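As a rough illustration of the two schemes the abstract describes, the following Python sketch performs a single consensus iteration with partial sharing and noise injection, plus a random public/private decomposition. It is a minimal sketch under stated assumptions: the function names (ppdl_consensus_step, decompose_state), the 0.5 mixing weight, the Gaussian noise model, and the uniform decomposition range are all illustrative choices, not the authors' exact updates or notation.

import numpy as np

def decompose_state(x, rng):
    # Randomly split each agent's state into a public substate (shared with
    # neighbors) and a private substate (kept and updated locally), mirroring
    # the decomposition-based variant. The uniform mixing range is an assumption.
    mix = rng.uniform(0.2, 0.8, size=x.shape)
    public = mix * x
    return public, x - public

def ppdl_consensus_step(x, adjacency, share_frac=0.25, noise_std=0.1, rng=None):
    # One consensus iteration of a noise-injection-based partial-sharing scheme:
    # each agent transmits only a random fraction of its noise-perturbed
    # coordinates, and receivers average those entries into their own states.
    rng = rng if rng is not None else np.random.default_rng()
    n_agents, dim = x.shape
    n_shared = max(1, int(share_frac * dim))  # entries sent per link
    x_next = x.copy()
    for i in range(n_agents):
        for j in np.flatnonzero(adjacency[i]):
            idx = rng.choice(dim, size=n_shared, replace=False)  # partial sharing
            noisy = x[j, idx] + rng.normal(0.0, noise_std, size=n_shared)  # privacy noise
            x_next[i, idx] = 0.5 * (x_next[i, idx] + noisy)  # local averaging
    return x_next

In the decomposition-based variant, ppdl_consensus_step would operate only on the public substate returned by decompose_state, while each private substate evolves through purely local updates; an honest-but-curious neighbor then observes only noisy fragments of the public substate, which is the source of the additional privacy the abstract reports.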
Pages: 705-720
Page count: 16