Infrared and Visible Image Fusion Method Using Salience Detection and Convolutional Neural Network

Cited: 5
Authors
Wang, Zetian [1 ]
Wang, Fei [1 ,2 ]
Wu, Dan [1 ]
Gao, Guowang [1 ]
Affiliations
[1] Xian Shiyou Univ, Sch Elect Engn, Xian 710065, Peoples R China
[2] Hunan Univ, State Key Lab Adv Design & Mfg Vehicle Body, Changsha 410082, Hunan, Peoples R China
Keywords
image fusion; salience detection; convolutional neural network; quality assessment
DOI
10.3390/s22145430
Chinese Library Classification (CLC)
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
This paper presents an algorithm for infrared and visible image fusion using salience detection and a convolutional neural network, with the aim of integrating discriminative features and improving the overall quality of visual perception. First, a global contrast-based salience detection algorithm is applied to the infrared image to extract salient features, highlighting high-brightness regions while suppressing low-brightness regions and image noise. Second, a dedicated loss function based on the salience-detection principle is designed for infrared images to guide feature extraction and reconstruction in the network, while the more common gradient loss is used as the loss function for visible images. A modified residual network then performs feature extraction and image reconstruction. Extensive qualitative and quantitative experiments show that the fused images are sharper, contain more scene information, and resemble high-quality visible images. Generalization experiments further demonstrate that the proposed model generalizes well, independent of the limitations of the sensor. Overall, the algorithm proposed in this paper outperforms other state-of-the-art methods.
Pages: 17
Related Papers
50 records
  • [41] Visible and Infrared Image Fusion Using Encoder-Decoder Network
    Ataman, Ferhat Can
    Bozdagi Akar, Gozde
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 1779 - 1783
  • [42] Intelligent Fusion of Infrared and Visible Image Data Based on Convolutional Sparse Representation and Improved Pulse-Coupled Neural Network
    Xia, Jingming
    Lu, Yi
    Tan, Ling
    Jiang, Ping
    CMC-COMPUTERS MATERIALS & CONTINUA, 2021, 67 (01): : 613 - 624
  • [43] Potential evaluation of visible-thermal UAV image fusion for individual tree detection based on convolutional neural network
    Moradi, Fatemeh
    Javan, Farzaneh Dadrass
    Samadzadegan, Farhad
    INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2022, 113
  • [44] DepthFuseNet: an approach for fusion of thermal and visible images using a convolutional neural network
    Patel, Heena
    Upla, Kishor P.
    OPTICAL ENGINEERING, 2021, 60 (01)
  • [45] STDFusionNet: An Infrared and Visible Image Fusion Network Based on Salient Target Detection
    Ma, Jiayi
    Tang, Linfeng
    Xu, Meilong
    Zhang, Hao
    Xiao, Guobao
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2021, 70
  • [46] Pedestrian detection-driven cascade network for infrared and visible image fusion
    Zheng, Bowen
    Huo, Hongtao
    Liu, Xiaowen
    Pang, Shan
    Li, Jing
    SIGNAL PROCESSING, 2024, 225
  • [47] Multi-exposure image fusion using convolutional neural network
    Akbulut, Harun
    Aslantas, Veysel
    JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, 2023, 38 (03): : 1439 - 1451
  • [48] MSFNet: MultiStage Fusion Network for infrared and visible image fusion
    Wang, Chenwu
    Wu, Junsheng
    Zhu, Zhixiang
    Chen, Hao
    NEUROCOMPUTING, 2022, 507 : 26 - 39
  • [49] SCGAFusion: A skip-connecting group convolutional attention network for infrared and visible image fusion
    Zhu, Danchen
    Ma, Jingbin
    Li, Dong
    Wang, Xiaoming
    APPLIED SOFT COMPUTING, 2024, 163
  • [50] Analysis of image forgery detection using convolutional neural network
    Gnaneshwar C.
    Singh M.K.
    Yadav S.S.
    Balabantaray B.K.
    International Journal of Applied Systemic Studies, 2022, 9 (03) : 240 - 260