Image Restoration via Deep Memory-Based Latent Attention Network

Cited by: 6
Authors
Zhang, Xinyan [1 ,2 ]
Gao, Peng [3 ]
Zhao, Kongya [1 ,2 ]
Liu, Sunxiangyu [1 ,2 ]
Li, Guitao [1 ]
Yin, Liuguo [2 ]
Affiliations
[1] Tsinghua Univ, Sch Aerosp Engn, Beijing 100084, Peoples R China
[2] Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China
[3] Peking Univ, Coll Engn, Beijing 100871, Peoples R China
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China;
Keywords
Image restoration; deep learning; deep memory-based network; latent attention block; SUPERRESOLUTION; ALGORITHM;
DOI
10.1109/ACCESS.2020.2999965
Chinese Library Classification
TP [automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
Deep convolutional neural networks (CNNs) have achieved impressive results in the field of image restoration. However, most deep CNN-based models make limited use of hierarchical features and treat those features equally, which restricts restoration performance. To address this issue, the present work proposes a novel memory-based latent attention network (MLANet) that aims to restore a high-quality image from its low-quality counterpart. The key idea is a memory-based latent attention block (MLAB), stacked throughout MLANet, that makes better use of global and local features across the network. Specifically, the MLAB contains a main branch and a latent branch: the former extracts local multi-level features, while the latter preserves global information through its latent structure. Furthermore, a multi-kernel attention module is incorporated into the latent branch to adaptively learn more effective features with mixed attention. To validate its effectiveness and generalization ability, MLANet is evaluated on three representative image restoration tasks: image super-resolution, image denoising, and image compression artifact reduction. Experimental results show that MLANet outperforms state-of-the-art methods on all three tasks.
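The multi-kernel attention idea sketched in the abstract — parallel branches with different receptive fields, fused by adaptively learned weights — can be illustrated roughly as follows. This is a minimal one-dimensional sketch, not the authors' implementation: the function names (`conv1d`, `multi_kernel_attention`) and the global-average-based weighting scheme are assumptions for illustration only.

```python
import math

def conv1d(x, kernel):
    """1D convolution with 'same' zero padding (illustrative, not optimized)."""
    k = len(kernel)
    pad = k // 2
    xp = [0.0] * pad + list(x) + [0.0] * pad
    return [sum(kernel[j] * xp[i + j] for j in range(k)) for i in range(len(x))]

def softmax(v):
    """Numerically stable softmax over a list of scores."""
    m = max(v)
    e = [math.exp(a - m) for a in v]
    s = sum(e)
    return [a / s for a in e]

def multi_kernel_attention(x, kernels):
    """Fuse features from branches with different kernel sizes.

    Each branch filters the input with its own kernel; a scalar
    descriptor per branch (global average pooling) is turned into
    attention weights via softmax, and the branch outputs are
    combined as a weighted sum.
    """
    branches = [conv1d(x, k) for k in kernels]
    desc = [sum(b) / len(b) for b in branches]   # global descriptor per branch
    w = softmax(desc)                            # adaptive branch weights
    return [sum(w[i] * b[t] for i, b in enumerate(branches))
            for t in range(len(x))]
```

In the paper's setting the same principle would apply to 2D feature maps inside the latent branch, with the attention weights letting the network emphasize whichever receptive field is most informative for the current input.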
Pages: 104728-104739 (12 pages)
Related Papers
50 records
  • [1] Latent Relational Metric Learning via Memory-based Attention for Collaborative Ranking
    Tay, Yi
    Luu Anh Tuan
    Hui, Siu Cheung
    WEB CONFERENCE 2018: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW2018), 2018: 729-739
  • [2] Memory-based Attention Graph Neural Network for Network Expert Recommendation
    Chen Z.
    Zhu M.
    Du J.
    Yuan X.
    Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2022, 49 (06): 116-123
  • [3] Memory-Based Sequential Attention
    Stock, Jason
    Anderson, Charles
    GAZE MEETS MACHINE LEARNING WORKSHOP, 2023, 226: 236-252
  • [4] Blind restoration of astronomical image based on deep attention generative adversarial neural network
    Luo, Lin
    Bao, Jiaqi
    Li, Jinlong
    Gao, Xiaorong
    OPTICAL ENGINEERING, 2022, 61 (01)
  • [5] A Memory-Based Realization of a Binarized Deep Convolutional Neural Network
    Nakahara, Hiroki
    Yonekawa, Haruyoshi
    Sasao, Tsutomu
    Iwamoto, Hisashi
    Motomura, Masato
    2016 INTERNATIONAL CONFERENCE ON FIELD-PROGRAMMABLE TECHNOLOGY (FPT), 2016: 277-280
  • [6] On altering motion perception via working memory-based attention shifts
    Turatto, Massimo
    Vescovi, Massimo
    Valsecchi, Matteo
    JOURNAL OF VISION, 2008, 8 (05)
  • [7] An efficient image encryption algorithm using a discrete memory-based logistic map with deep neural network
    Kumar, B. Sakthi
    Revathi, R.
    Journal of Engineering and Applied Science, 2024, 71 (01)
  • [8] DSegAN: A Deep Light-weight Segmentation-based Attention Network for Image Restoration
    Esmaeilzehi, Alireza
    Ahmad, M. Omair
    Swamy, M. N. S.
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022: 1284-1288
  • [9] Light Field Image Restoration via Latent Diffusion and Multi-View Attention
    Zhang, Shansi
    Lam, Edmund Y.
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31: 1094-1098
  • [10] Feature memory-based deep recurrent neural network for language modeling
    Deng, Hongli
    Zhang, Lei
    Shu, Xin
    APPLIED SOFT COMPUTING, 2018, 68: 432-446