Hierarchical accumulation network with grid attention for image super-resolution

Cited by: 10
Authors
Yang, Yue [1 ]
Qi, Yong [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Comp Sci & Technol, Xian, Shaanxi, Peoples R China
Keywords
Image super-resolution; Grouping; Attention mechanism; Accumulation network;
DOI
10.1016/j.knosys.2021.107520
CLC number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep convolutional neural networks (CNNs) have recently shown promising results in single image super-resolution (SISR) due to their powerful representation ability. However, existing CNN-based SR methods mainly focus on deeper architecture design to obtain high-level semantic information, neglecting the fine-grained texture information carried by intermediate layers and thus limiting their capacity to produce precise high-resolution images. To tackle this issue, we propose a hierarchical accumulation network (HAN) with grid attention in this paper. Specifically, a hierarchical feature accumulation (HFA) structure is proposed to accumulate the outputs of intermediate layers in a grouping manner, exploiting features of different semantic levels. Moreover, we introduce a multi-scale grid attention module (MGAM) to refine features of the same level. The MGAM employs pyramid sampling with a self-attention mechanism to efficiently model non-local dependencies between pixel features and produce refined representations. In this way, universal features that reflect both spatial similarity and semantic level are produced for image SR. Experimental results on five benchmark datasets with different degradation models demonstrate the superiority of our HAN in terms of quantitative metrics and visual quality. (c) 2021 Elsevier B.V. All rights reserved.
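
The abstract only describes the architecture at a high level, so the following is a minimal PyTorch sketch of the two ideas it names: group-wise accumulation of intermediate features (as in HFA) and pyramid-sampled non-local attention (as in MGAM). All class names, layer choices, channel counts, and group sizes below are assumptions made for illustration; they are not the paper's actual configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidNonLocalAttention(nn.Module):
    """Grid/pyramid-sampled self-attention: keys and values come from
    downsampled copies of the feature map so the affinity matrix stays small."""

    def __init__(self, channels, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.query = nn.Conv2d(channels, channels // 2, 1)
        self.key = nn.Conv2d(channels, channels // 2, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.proj = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)               # (b, h*w, c/2)
        keys, values = [], []
        for s in self.scales:                                      # pyramid sampling
            xs = F.adaptive_avg_pool2d(x, (max(h // s, 1), max(w // s, 1)))
            keys.append(self.key(xs).flatten(2))                   # (b, c/2, n_s)
            values.append(self.value(xs).flatten(2))               # (b, c, n_s)
        k = torch.cat(keys, dim=2)
        v = torch.cat(values, dim=2)
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)  # (b, h*w, n)
        out = (attn @ v.transpose(1, 2)).transpose(1, 2).reshape(b, c, h, w)
        return x + self.proj(out)                                  # residual refinement


class HierarchicalAccumulationNet(nn.Module):
    """Accumulates intermediate outputs group by group: each group concatenates
    its blocks' outputs, fuses them with a 1x1 conv, then refines the result."""

    def __init__(self, channels=64, n_groups=3, blocks_per_group=4):
        super().__init__()

        def res_block():
            return nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1))

        self.groups = nn.ModuleList(
            nn.ModuleList(res_block() for _ in range(blocks_per_group))
            for _ in range(n_groups))
        self.fuse = nn.ModuleList(
            nn.Conv2d(channels * blocks_per_group, channels, 1)
            for _ in range(n_groups))
        self.attn = PyramidNonLocalAttention(channels)

    def forward(self, x):
        feat = x
        for blocks, fuse in zip(self.groups, self.fuse):
            outs = []
            for blk in blocks:
                feat = feat + blk(feat)             # residual block
                outs.append(feat)
            feat = fuse(torch.cat(outs, dim=1))     # accumulate this group's features
            feat = self.attn(feat)                  # refine same-level features
        return x + feat                             # global residual


if __name__ == "__main__":
    net = HierarchicalAccumulationNet(channels=64)
    print(net(torch.randn(1, 64, 48, 48)).shape)    # torch.Size([1, 64, 48, 48])

The sketch omits the shallow feature extractor and the upsampling/reconstruction tail that any SR network needs; it is only meant to make the grouping-plus-attention idea concrete.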
Pages: 12
Related papers (50 in total)
  • [31] Cross-resolution feature attention network for image super-resolution
    Liu, Anqi
    Li, Sumei
    Chang, Yongli
    VISUAL COMPUTER, 2023, 39 (09): 3837 - 3849
  • [32] Lightweight image super-resolution with multiscale residual attention network
    Xiao, Cunjun
    Dong, Hui
    Li, Haibin
    Li, Yaqian
    Zhang, Wenming
    JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (04)
  • [33] SRGAT: Single Image Super-Resolution With Graph Attention Network
    Yan, Yanyang
    Ren, Wenqi
    Hu, Xiaobin
    Li, Kun
    Shen, Haifeng
    Cao, Xiaochun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 4905 - 4918
  • [34] Information-Growth Attention Network for Image Super-Resolution
    Li, Zhuangzi
    Li, Ge
    Li, Thomas
    Liu, Shan
    Gao, Wei
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021: 544 - 552
  • [35] Lightweight image super-resolution with sliding Proxy Attention Network
    Hu, Zhenyu
    Sun, Wanjie
    Chen, Zhenzhong
    SIGNAL PROCESSING, 2025, 227
  • [36] Deep coordinate attention network for single image super-resolution
    Xie, Chao
    Zhu, Hongyu
    Fei, Yeqi
    IET IMAGE PROCESSING, 2022, 16 (01) : 273 - 284
  • [37] PYRAMID FUSION ATTENTION NETWORK FOR SINGLE IMAGE SUPER-RESOLUTION
    He, Hao
    Du, Zongcai
    Li, Wenfeng
    Tang, Jie
    Wu, Gangshan
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 2165 - 2169
  • [38] Omnidirectional image super-resolution via position attention network
    Wang, Xin
    Wang, Shiqi
    Li, Jinxing
    Li, Mu
    Li, Jinkai
    Xu, Yong
    NEURAL NETWORKS, 2024, 178
  • [39] Channel attention and residual concatenation network for image super-resolution
    Cai T.-J.
    Peng X.-Y.
    Shi Y.-P.
    Huang J.
    Chinese Academy of Sciences (29): 142 - 151
  • [40] Multi-scale attention network for image super-resolution
    Wang, Li
    Shen, Jie
    Tang, E.
    Zheng, Shengnan
    Xu, Lizhong
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 80