Infrared and Visible Image Fusion via Interactive Compensatory Attention Adversarial Learning

Cited by: 16
Authors
Wang, Zhishe [1 ]
Shao, Wenyu [1 ]
Chen, Yanlin [1 ]
Xu, Jiawei [2 ]
Zhang, Xiaoqin [2 ]
Affiliations
[1] Taiyuan Univ Sci & Technol, Sch Appl Sci, Taiyuan 030024, Peoples R China
[2] Wenzhou Univ, Key Lab Intelligent Informat Safety & Emergency Z, Wenzhou 325035, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image fusion; attention interaction; attention compensation; dual discriminators; adversarial learning; NETWORK; NEST;
DOI
10.1109/TMM.2022.3228685
CLC classification
TP [Automation & Computer Technology];
Subject classification code
0812;
Abstract
Existing generative adversarial fusion methods generally concatenate source images or deep features and extract local features through convolutional operations, without considering their global characteristics, which tends to limit fusion performance. Toward this end, we propose a novel interactive compensatory attention fusion network, termed ICAFusion. In the generator, we construct a multi-level encoder-decoder network with a triple path, in which dedicated infrared and visible paths provide additional intensity and gradient information for the concatenating path. Moreover, we develop interactive and compensatory attention modules to exchange path-wise information, and model long-range dependencies through a cascaded channel-spatial attention model. The generated attention maps focus more on infrared target perception and visible detail characterization, and are used to reconstruct the fused image. The generator therefore takes full advantage of both local and global features, further increasing the representation ability of feature extraction and feature reconstruction. Extensive experiments show that ICAFusion achieves superior fusion performance and better generalization ability, surpassing other state-of-the-art methods in both subjective visual comparison and objective metric evaluation.
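The abstract describes modeling long-range dependencies through a cascaded channel-spatial attention model. The paper's implementation is not reproduced here, so the following is only a minimal NumPy sketch of that general idea in the CBAM style (channel gating followed by spatial gating); the tensor shapes, the tiny shared MLP with weights `w1`/`w2`, and the simplified spatial branch are all assumptions for illustration, not the authors' architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_spatial_attention(feat, w1, w2):
    """Cascaded channel-then-spatial attention over a (C, H, W) feature map.

    w1, w2: weights of a small shared two-layer MLP (hypothetical shapes
    (C, C//r) and (C//r, C)) used by the channel branch.
    """
    # Channel attention: squeeze spatial dims by average- and max-pooling,
    # pass both vectors through the shared MLP, sum, and sigmoid-gate channels.
    avg = feat.mean(axis=(1, 2))                     # (C,)
    mx = feat.max(axis=(1, 2))                       # (C,)
    mlp = lambda v: np.maximum(v @ w1, 0.0) @ w2     # ReLU hidden layer
    ch = sigmoid(mlp(avg) + mlp(mx))                 # (C,) channel map
    feat = feat * ch[:, None, None]
    # Spatial attention on the channel-refined features: squeeze channels
    # by mean/max and sigmoid-gate each spatial location (simplified; CBAM
    # uses a convolution over the stacked mean/max maps instead of a sum).
    sp = sigmoid(feat.mean(axis=0) + feat.max(axis=0))  # (H, W) spatial map
    return feat * sp[None, :, :], ch, sp
```

The cascade order (channel first, then spatial) lets the spatial map be computed from already channel-reweighted features, which is the usual motivation for chaining the two attentions rather than applying them in parallel.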
Pages: 7800-7813 (14 pages)