EAdderSR: enhanced AdderSR for single image super resolution

Times cited: 0
Authors
Song, Jie [1 ]
Yi, Huawei [1 ]
Xu, Wenqian [1 ]
Li, Xiaohui [1 ]
Li, Bo [1 ]
Liu, Yuanyuan [2 ]
Affiliations
[1] Liaoning Univ Technol, Sch Elect & Informat Engn, Jinzhou 121001, Peoples R China
[2] Dalian Univ, Key Lab Adv Design & Intelligent Comp, Minist Educ, Dalian 116622, Peoples R China
Keywords
Super resolution; Adder neural network; Computation complexity; Knowledge distillation; SUPERRESOLUTION; NETWORKS;
DOI
10.1007/s10489-023-04536-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Replacing multiplication with addition can effectively reduce computational complexity. Based on this idea, adder neural networks (AdderNets) were proposed. AdderNets were subsequently applied to the super-resolution (SR) task to obtain AdderSR, which significantly reduces the energy consumption of SR models. However, the weak fitting ability of AdderNets makes AdderSR applicable only to the low-complexity pixel-wise loss, and the model's performance drops sharply when the high-complexity perceptual loss is used. Enhanced AdderSR (EAdderSR) is proposed to overcome these limitations of AdderSR in SR tasks. Specifically, current adder networks suffer from a serious loss of gradient precision, which harms training stability. The normalization layer is therefore adjusted so that the output of the adder layer is normalized to a reasonably narrow range, which reduces the amount of precision loss. Then, a coarse-grained knowledge distillation (CGKD) method is developed to give adder networks efficient guidance and reduce their fitting burden. Experimental results show that the proposed method not only further improves the performance of adder networks, but also preserves the quality of the output when the complexity of the loss function increases.
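The following is a minimal, hypothetical PyTorch sketch of the two mechanisms the abstract describes: an AdderNet-style layer (negative L1 distance replacing multiply-accumulate, following the published AdderNet formulation), a normalization step placed immediately after the adder layer to keep its output in a narrow range, and a coarse-grained distillation loss that matches channel-averaged features of the adder student against a conventional CNN teacher. The names (AdderConv2d, AdderBlock, cgkd_loss) and the exact distillation granularity are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only; not the EAdderSR source code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdderConv2d(nn.Module):
    """Convolution-like layer that replaces multiply-accumulate with
    additions: output = -sum(|x_patch - w|), as in AdderNet."""

    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.1)
        self.kernel_size = kernel_size
        self.stride, self.padding = stride, padding

    def forward(self, x):
        b, _, h, w = x.shape
        # Unfold the input into sliding patches: (B, C*k*k, L)
        patches = F.unfold(x, self.kernel_size,
                           stride=self.stride, padding=self.padding)
        w_flat = self.weight.view(self.weight.size(0), -1)  # (O, C*k*k)
        # Negative L1 distance between every patch and every filter:
        # (B, 1, C*k*k, L) vs (1, O, C*k*k, 1) -> (B, O, L)
        out = -(patches.unsqueeze(1) - w_flat[None, :, :, None]).abs().sum(2)
        h_out = (h + 2 * self.padding - self.kernel_size) // self.stride + 1
        w_out = (w + 2 * self.padding - self.kernel_size) // self.stride + 1
        return out.view(b, -1, h_out, w_out)


class AdderBlock(nn.Module):
    """Adder layer followed by normalization; the abstract's point is that
    normalizing the adder output to a narrow range limits gradient
    precision loss during training."""

    def __init__(self, channels):
        super().__init__()
        self.adder = AdderConv2d(channels, channels, 3, padding=1)
        self.norm = nn.BatchNorm2d(channels)  # stand-in for the adjusted norm

    def forward(self, x):
        return F.relu(self.norm(self.adder(x)))


def cgkd_loss(student_feats, teacher_feats):
    """Coarse-grained distillation: match channel-averaged feature maps of
    the adder student against a CNN teacher (illustrative formulation)."""
    loss = 0.0
    for s, t in zip(student_feats, teacher_feats):
        loss = loss + F.l1_loss(s.mean(dim=1), t.detach().mean(dim=1))
    return loss
```

The sketch is only meant to make the abstract's terminology concrete; the paper's actual normalization adjustment and CGKD objective may differ in detail.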
Pages: 20998-21011
Number of pages: 14