Simultaneous Learning Knowledge Distillation for Image Restoration: Efficient Model Compression for Drones

Times Cited: 0
Author
Zhang, Yongheng [1]
Affiliation
[1] Beijing Univ Posts & Telecommun, Sch Comp, 10 Xitucheng Rd, Beijing 100876, Peoples R China
Keywords
knowledge distillation; model compression; drone-view image restoration; QUALITY ASSESSMENT;
DOI
10.3390/drones9030209
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology];
Subject Classification Codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
Deploying high-performance image restoration models on drones is critical for applications like autonomous navigation, surveillance, and environmental monitoring. However, the computational and memory limitations of drones pose significant challenges to utilizing complex image restoration models in real-world scenarios. To address this issue, we propose the Simultaneous Learning Knowledge Distillation (SLKD) framework, specifically designed to compress image restoration models for resource-constrained drones. SLKD introduces a dual-teacher, single-student architecture that integrates two complementary learning strategies: Degradation Removal Learning (DRL) and Image Reconstruction Learning (IRL). In DRL, the student encoder learns to eliminate degradation factors by mimicking Teacher A, which processes degraded images utilizing a BRISQUE-based extractor to capture degradation-sensitive natural scene statistics. Concurrently, in IRL, the student decoder reconstructs clean images by learning from Teacher B, which processes clean images, guided by a PIQE-based extractor that emphasizes the preservation of edge and texture features essential for high-quality reconstruction. This dual-teacher approach enables the student model to learn from both degraded and clean images simultaneously, achieving robust image restoration while significantly reducing computational complexity. Experimental evaluations across five benchmark datasets and three restoration tasks (deraining, deblurring, and dehazing) demonstrate that, compared to the teacher models, the SLKD student models achieve an average reduction of 85.4% in FLOPs and 85.8% in model parameters, with only a slight average decrease of 2.6% in PSNR and 0.9% in SSIM. These results highlight the practicality of integrating SLKD-compressed models into autonomous systems, offering efficient and real-time image restoration for aerial platforms operating in challenging environments.
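The authors' implementation is not part of this record; the PyTorch sketch below only illustrates the training signal described in the abstract, i.e. a compact student distilled from two teachers at once. The TinyRestorer module, the slkd_step function, the loss weights, and the plain L1 feature matching used here in place of the BRISQUE- and PIQE-based extractors are all illustrative assumptions, not the published SLKD code.

# Minimal sketch of a dual-teacher, single-student distillation step in the
# spirit of SLKD. Assumptions: toy encoder-decoder architecture, L1 feature
# matching instead of BRISQUE-/PIQE-based extractors, arbitrary loss weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    # 3x3 convolution + ReLU used by the toy encoder/decoder below.
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))

class TinyRestorer(nn.Module):
    # Toy encoder-decoder standing in for a restoration network.
    def __init__(self, width: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(conv_block(3, width), conv_block(width, width))
        self.decoder = nn.Sequential(conv_block(width, width), nn.Conv2d(width, 3, 3, padding=1))

    def forward(self, x):
        feat = self.encoder(x)    # features the DRL loss is applied to
        out = self.decoder(feat)  # restored image the IRL loss is applied to
        return feat, out

def slkd_step(student, teacher_a, teacher_b, degraded, clean,
              w_drl=1.0, w_irl=1.0, w_rec=1.0):
    # One distillation step: DRL aligns the student encoder with Teacher A
    # (which sees the degraded image); IRL aligns the student output with
    # Teacher B (which sees the clean image); a pixel loss anchors restoration.
    with torch.no_grad():
        feat_a, _ = teacher_a(degraded)  # Teacher A: degradation-removal target
        _, out_b = teacher_b(clean)      # Teacher B: reconstruction target
    feat_s, out_s = student(degraded)
    loss_drl = F.l1_loss(feat_s, feat_a)  # Degradation Removal Learning
    loss_irl = F.l1_loss(out_s, out_b)    # Image Reconstruction Learning
    loss_rec = F.l1_loss(out_s, clean)    # supervised restoration loss
    return w_drl * loss_drl + w_irl * loss_irl + w_rec * loss_rec

if __name__ == "__main__":
    student = TinyRestorer(width=16)
    teacher_a, teacher_b = TinyRestorer(16), TinyRestorer(16)
    degraded, clean = torch.rand(2, 3, 64, 64), torch.rand(2, 3, 64, 64)
    loss = slkd_step(student, teacher_a, teacher_b, degraded, clean)
    loss.backward()
    print(f"total distillation loss: {loss.item():.4f}")

For shape compatibility without extra projection layers, the sketch gives the teachers and the student the same width; in the paper's setting the teachers are much larger networks and the student is the compressed model intended for deployment on the drone.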
Pages: 23
Related Papers (50 in total)
  • [1] Energy-Efficient Federated Knowledge Distillation Learning in Internet of Drones
    Cal, Semih
    Sun, Xiang
    Yao, Jingjing
    2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024, : 1256 - 1261
  • [2] Model Compression Algorithm via Reinforcement Learning and Knowledge Distillation
    Liu, Botao
    Hu, Bing-Bing
    Zhao, Ming
    Peng, Sheng-Lung
    Chang, Jou-Ming
    Tsoulos, Ioannis G.
    MATHEMATICS, 2023, 11 (22)
  • [3] Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning
    Malihi, Leila
    Heidemann, Gunther
    BIG DATA AND COGNITIVE COMPUTING, 2023, 7 (03)
  • [4] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones
    Yao, Jingjing
    Cal, Semih
    Sun, Xiang
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): 8064 - 8074
  • [5] Knowledge Distillation Beyond Model Compression
    Sarfraz, Fahad
    Arani, Elahe
    Zonooz, Bahram
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 6136 - 6143
  • [6] Distortion Disentanglement and Knowledge Distillation for Satellite Image Restoration
    Kandula, Praveen
    Rajagopalan, A. N.
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [7] Joint structured pruning and dense knowledge distillation for efficient transformer model compression
    Cui, Baiyun
    Li, Yingming
    Zhang, Zhongfei
    NEUROCOMPUTING, 2021, 458 : 56 - 69
  • [8] IMAGE COMPRESSION AND RESTORATION INCORPORATING PRIOR KNOWLEDGE
    HALL, TJ
    DARLING, AM
    FIDDY, MA
    OPTICS LETTERS, 1982, 7 (10) : 467 - 468
  • [9] Mitigating carbon footprint for knowledge distillation based deep learning model compression
    Rafat, Kazi
    Islam, Sadia
    Mahfug, Abdullah Al
    Hossain, Md. Ismail
    Rahman, Fuad
    Momen, Sifat
    Rahman, Shafin
    Mohammed, Nabeel
    PLOS ONE, 2023, 18 (05)
  • [10] AUGMENTING KNOWLEDGE DISTILLATION WITH PEER-TO-PEER MUTUAL LEARNING FOR MODEL COMPRESSION
    Niyaz, Usma
    Bathula, Deepti R.
    2022 IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (IEEE ISBI 2022), 2022