Infrared Image Super-Resolution via Progressive Compact Distillation Network

Cited by: 4
Authors
Fan, Kefeng [1 ,2 ]
Hong, Kai [2 ]
Li, Fei [3 ]
Affiliations
[1] China Elect Standardizat Inst, Informat Technol Res Ctr, Beijing 100007, Peoples R China
[2] Guilin Univ Elect Technol, Dept Elect Engn & Automat, Guilin 541004, Peoples R China
[3] PengCheng Lab, Shenzhen 518055, Peoples R China
Keywords
infrared image super-resolution; information distillation; lightweight network; transfer learning;
DOI
10.3390/electronics10243107
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Deep convolutional neural networks achieve remarkable performance in single-image super-resolution (SISR). However, owing to the limited availability of infrared images, heavy network architectures trained on insufficient infrared data suffer from excessive parameters and computational complexity. To address these issues, we propose a lightweight progressive compact distillation network (PCDN) with a transfer learning strategy to achieve infrared image super-resolution reconstruction from only a few samples. We design a progressive feature residual distillation block (PFDB) to efficiently refine hierarchical features, and parallel dilated convolutions are used to expand the PFDB's receptive field, thereby maximizing the representational power of marginal features while minimizing the number of network parameters. Moreover, a bi-global connection mechanism and a difference calculation algorithm between two adjacent PFDBs are proposed to accelerate network convergence and to extract high-frequency information, respectively. Furthermore, we introduce transfer learning to fine-tune the network weights on few-shot infrared images to obtain infrared image mapping information. Experimental results demonstrate the effectiveness and superiority of the proposed framework, at low computational cost, for infrared image super-resolution. Notably, our PCDN outperforms existing methods on two public datasets for both ×2 and ×4 upscaling with fewer than 240 k parameters, proving its efficient and excellent reconstruction performance.
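The abstract's claim that dilated convolutions enlarge the receptive field without adding parameters can be illustrated with a minimal sketch. This is pure NumPy for illustration only; `dilated_conv2d` is a hypothetical helper, not the authors' PCDN code, and a real network would use a deep-learning framework's convolution layers. A 3×3 kernel with dilation 2 covers a 5×5 neighbourhood while still using only nine weights.

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation=1):
    """Minimal 'valid' 2D correlation with a dilated kernel.

    Illustrative sketch -- NOT the PCDN implementation. The effective
    kernel footprint grows as k_eff = k + (k - 1) * (dilation - 1),
    so the receptive field widens at no extra parameter cost.
    """
    kh, kw = kernel.shape
    eh = kh + (kh - 1) * (dilation - 1)  # effective height
    ew = kw + (kw - 1) * (dilation - 1)  # effective width
    H, W = x.shape
    out = np.zeros((H - eh + 1, W - ew + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Sample the input at dilated strides inside the footprint.
            patch = x[i:i + eh:dilation, j:j + ew:dilation]
            out[i, j] = np.sum(patch * kernel)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = np.ones((3, 3)) / 9.0  # same nine weights in both branches

# Two parallel branches over the same input, as in a PFDB-style block:
y1 = dilated_conv2d(x, k, dilation=1)  # 3x3 footprint -> (6, 6) output
y2 = dilated_conv2d(x, k, dilation=2)  # 5x5 footprint -> (4, 4) output
print(y1.shape, y2.shape)
```

In practice the branch outputs would be aligned (e.g. via padding) and fused; the point here is only that dilation trades spatial coverage for free, which is why the paper can keep the parameter count under 240 k.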
Pages: 11