Simultaneous Learning Knowledge Distillation for Image Restoration: Efficient Model Compression for Drones

Cited by: 0
Author
Zhang, Yongheng [1]
Affiliation
[1] Beijing Univ Posts & Telecommun, Sch Comp, 10 Xitucheng Rd, Beijing 100876, Peoples R China
Keywords
knowledge distillation; model compression; drone-view image restoration; quality assessment
DOI
10.3390/drones9030209
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Discipline Classification Codes
081102; 0816; 081602; 083002; 1404
Abstract
Deploying high-performance image restoration models on drones is critical for applications like autonomous navigation, surveillance, and environmental monitoring. However, the computational and memory limitations of drones pose significant challenges to deploying complex image restoration models in real-world scenarios. To address this issue, we propose the Simultaneous Learning Knowledge Distillation (SLKD) framework, specifically designed to compress image restoration models for resource-constrained drones. SLKD introduces a dual-teacher, single-student architecture that integrates two complementary learning strategies: Degradation Removal Learning (DRL) and Image Reconstruction Learning (IRL). In DRL, the student encoder learns to eliminate degradation factors by mimicking Teacher A, which processes degraded images using a BRISQUE-based extractor to capture degradation-sensitive natural scene statistics. Concurrently, in IRL, the student decoder reconstructs clean images by learning from Teacher B, which processes clean images, guided by a PIQE-based extractor that emphasizes the preservation of edge and texture features essential for high-quality reconstruction. This dual-teacher approach enables the student model to learn from both degraded and clean images simultaneously, achieving robust image restoration while significantly reducing computational complexity. Experimental evaluations across five benchmark datasets and three restoration tasks (deraining, deblurring, and dehazing) demonstrate that, compared to the teacher models, the SLKD student models achieve an average reduction of 85.4% in FLOPs and 85.8% in model parameters, with only a slight average decrease of 2.6% in PSNR and 0.9% in SSIM. These results highlight the practicality of integrating SLKD-compressed models into autonomous systems, offering efficient and real-time image restoration for aerial platforms operating in challenging environments.
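To make the dual-teacher, single-student setup concrete, the sketch below outlines one possible training step in PyTorch: the student encoder is matched to Teacher A's encoder features on degraded inputs (DRL), while the student decoder is matched to Teacher B's reconstruction of the clean image (IRL), alongside a standard supervised reconstruction loss. The tiny stand-in networks, module and variable names, L1 feature-matching losses, and loss weights are illustrative assumptions only, not the authors' implementation; the BRISQUE- and PIQE-based extractors are omitted for brevity.

```python
# Illustrative sketch of a dual-teacher, single-student distillation step.
# All names and losses here are assumptions for illustration, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyEncoderDecoder(nn.Module):
    """Stand-in restoration network: the encoder removes degradation, the decoder reconstructs."""

    def __init__(self, width: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, 3, 3, padding=1),
        )

    def forward(self, x):
        feat = self.encoder(x)            # encoder features (degradation-removal stage)
        return feat, self.decoder(feat)   # decoder output (reconstruction stage)


# Wide teacher networks and a narrow, compressed student; teachers serve only
# as frozen targets (their forward passes run under no_grad).
teacher_a = TinyEncoderDecoder(width=64).eval()  # Teacher A: processes degraded images (DRL)
teacher_b = TinyEncoderDecoder(width=64).eval()  # Teacher B: processes clean images (IRL)
student = TinyEncoderDecoder(width=16)
proj_enc = nn.Conv2d(16, 64, 1)  # 1x1 projection to align student and teacher channel widths

degraded = torch.rand(2, 3, 64, 64)  # degraded drone-view images (placeholder data)
clean = torch.rand(2, 3, 64, 64)     # paired ground-truth clean images (placeholder data)

with torch.no_grad():
    feat_a, _ = teacher_a(degraded)  # degradation-sensitive encoder features from Teacher A
    _, out_b = teacher_b(clean)      # high-quality reconstruction target from Teacher B

feat_s, out_s = student(degraded)

loss_drl = F.l1_loss(proj_enc(feat_s), feat_a)  # DRL: student encoder mimics Teacher A
loss_irl = F.l1_loss(out_s, out_b)              # IRL: student decoder follows Teacher B
loss_rec = F.l1_loss(out_s, clean)              # supervised reconstruction loss

total_loss = loss_rec + 0.5 * loss_drl + 0.5 * loss_irl  # placeholder loss weights
total_loss.backward()
```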
Pages: 23