A communication-efficient distributed deep learning remote sensing image change detection framework

Cited by: 2
Authors
Cheng, Hongquan [1 ,2 ]
Zheng, Jie [2 ]
Wu, Huayi [2 ]
Qi, Kunlun [3 ]
He, Lihua [4 ]
Affiliations
[1] Guangdong Univ Technol, Sch Architecture & Urban Planning, Guangzhou, Peoples R China
[2] State Key Lab Informat Engn Surveying, Mapping & Remote Sensing LIESMARS, Wuhan, Peoples R China
[3] China Univ Geosci Wuhan, Sch Geog & Informat Engn, Wuhan, Peoples R China
[4] Hubei Prov Geog Natl Condit Monitoring Ctr, Wuhan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Change detection; Distributed deep learning; Parallel computing; Communication compression; Staleness compensation; META-ANALYSIS; NETWORK;
DOI
10.1016/j.jag.2024.103840
CLC Number
TP7 [Remote Sensing];
Discipline Codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
With the introduction of deep learning methods, the computation required for remote sensing change detection has increased significantly, and distributed computing is therefore applied to improve computational efficiency. However, because deep learning models are large, the time-consuming gradient transfer during distributed model training weakens the acceleration achievable in change detection. Data communication and updates can become bottlenecks in distributed change detection systems with limited network resources. To address these interrelated problems, we propose a communication-efficient distributed deep learning remote sensing change detection framework (CEDD-CD) based on a synchronous update architecture. CEDD-CD integrates change detection with communication-efficient distributed gradient compression, which substantially reduces the volume of data to be transferred. In addition, to counter the implicit effect of delayed compressed gradient updates, a momentum compensation mechanism grounded in theoretical analysis is constructed to shorten the time required for model convergence and strengthen the stability of distributed training. We also designed a unified distributed change detection system architecture to reduce the complexity of distributed modeling. Experiments on three datasets demonstrate, both qualitatively and quantitatively, that CEDD-CD is effective for massive remote sensing image change detection.
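The gradient-compression idea sketched in the abstract can be illustrated with a minimal top-k sparsification example that keeps a local momentum term and an error-feedback residual, in the spirit of the momentum/staleness compensation the paper describes. This is an illustrative sketch of the general technique, not the CEDD-CD algorithm itself; all class and function names (`topk_compress`, `CompressedWorker`, etc.) are hypothetical.

```python
import numpy as np

def topk_compress(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of entries."""
    k = max(1, int(grad.size * ratio))
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx].copy()

def decompress(idx, values, shape):
    """Rebuild a dense tensor from a sparse (index, value) update."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

class CompressedWorker:
    """One worker: accumulates momentum and an error-feedback residual
    locally, and transmits only a sparse top-k update each step."""

    def __init__(self, shape, ratio=0.01, beta=0.9):
        self.momentum = np.zeros(shape)
        self.residual = np.zeros(shape)  # untransmitted gradient mass
        self.ratio = ratio
        self.beta = beta

    def step(self, grad):
        # Momentum correction: apply momentum before compression, so the
        # delayed (stale) part of the update is compensated locally.
        self.momentum = self.beta * self.momentum + grad
        self.residual += self.momentum
        idx, values = topk_compress(self.residual, self.ratio)
        # Zero the transmitted entries; the remainder stays buffered and
        # is folded into later updates (error feedback).
        self.residual.ravel()[idx] = 0.0
        return idx, values
```

With a 1000-parameter gradient and `ratio=0.01`, only 10 (index, value) pairs cross the network per step, while the residual buffer guarantees no gradient mass is lost; a parameter server would sum the decompressed sparse updates from all workers under the synchronous architecture the paper assumes.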
Pages: 13