A semi-parallel CNN-transformer fusion network for semantic change detection

Citations: 1
Authors
Zou, Changzhong [1 ]
Wang, Ziyuan [1 ]
Affiliations
[1] Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350000, Peoples R China
Keywords
Fusion semantic change detection network (FSCD); Transformer; Convolutional neural network (CNN); Siamese; UNSUPERVISED CHANGE DETECTION; IMAGE
DOI
10.1016/j.imavis.2024.105157
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Semantic change detection (SCD) recognizes both the regions and the types of changes in remote sensing images. Existing methods are based on either the transformer or the convolutional neural network (CNN), but because ground objects vary in size, a model needs global modeling ability and local information extraction ability at the same time. In this paper, we therefore propose a fusion semantic change detection network (FSCD) that gains both abilities by fusing a transformer with a CNN. We also propose a semi-parallel fusion block to construct FSCD: it not only computes global and local features in parallel but also fuses them as deeply as a serial design would. To adaptively decide which mechanism is applied to which pixel, we design a self-attention and convolution selection module (ACSM). ACSM is a self-attention mechanism that selectively combines the transformer and the CNN: the importance of each mechanism is learned automatically, and the mechanism best suited to each pixel is selected accordingly, which outperforms using either mechanism alone. We evaluate the proposed FSCD on two datasets, where it achieves significant improvements over state-of-the-art networks.
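The abstract describes ACSM as a learned, per-pixel selection between the transformer (global) and CNN (local) branches, but gives no implementation details. Below is a minimal sketch of how such a soft selection could be realized; the module name ACSM comes from the paper, while every layer choice, tensor shape, and the softmax-weighted combination are assumptions rather than the authors' code.

    import torch
    import torch.nn as nn

    class ACSM(nn.Module):
        # Hypothetical sketch: learn per-pixel weights that decide how much
        # of the transformer (global) and CNN (local) branch features to keep.
        def __init__(self, channels: int):
            super().__init__()
            # Small conv head mapping the concatenated branch features to two
            # per-pixel importance logits, one per mechanism.
            self.score = nn.Sequential(
                nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, 2, kernel_size=1),
            )

        def forward(self, f_trans, f_conv):
            # f_trans, f_conv: (B, C, H, W) features from the two branches.
            logits = self.score(torch.cat([f_trans, f_conv], dim=1))  # (B, 2, H, W)
            w = torch.softmax(logits, dim=1)  # learned per-pixel importance
            # Each pixel softly "selects" the mechanism that suits it better.
            return w[:, 0:1] * f_trans + w[:, 1:2] * f_conv

    # Usage: fuse 64-channel feature maps from both branches.
    acsm = ACSM(channels=64)
    fused = acsm(torch.randn(2, 64, 128, 128), torch.randn(2, 64, 128, 128))
    print(fused.shape)  # torch.Size([2, 64, 128, 128])

A hard (argmax) selection would make the per-pixel choice non-differentiable; a softmax over the two logits keeps training end-to-end while still letting each pixel favor one mechanism.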
Pages: 12
Related Papers
50 records
  • [41] Harmful Cyanobacterial Blooms forecasting based on improved CNN-Transformer and Temporal Fusion Transformer
    Ahn, Jung Min
    Kim, Jungwook
    Kim, Hongtae
    Kim, Kyunghyun
    ENVIRONMENTAL TECHNOLOGY & INNOVATION, 2023, 32
  • [42] A Lightweight CNN-Transformer Network With Laplacian Loss for Low-Altitude UAV Imagery Semantic Segmentation
    Lu, Wen
    Zhang, Zhiqi
    Nguyen, Minh
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 20
  • [43] TFCNs: A CNN-Transformer Hybrid Network for Medical Image Segmentation
    Li, Zihan
    Li, Dihan
    Xu, Cangbai
    Wang, Weice
    Hong, Qingqi
    Li, Qingde
    Tian, Jie
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 781 - 792
  • [44] Image Deblurring Based on an Improved CNN-Transformer Combination Network
    Chen, Xiaolin
    Wan, Yuanyuan
    Wang, Donghe
    Wang, Yuqing
APPLIED SCIENCES-BASEL, 2023, 13 (01)
  • [45] MedFCT: A Frequency Domain Joint CNN-Transformer Network for Semi-supervised Medical Image Segmentation
    Xie, Shiao
    Huang, Huimin
    Niu, Ziwei
    Lin, Lanfen
    Chen, Yen-Wei
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1913 - 1918
  • [46] GhostFormer: Efficiently amalgamated CNN-transformer architecture for object detection
    Xie, Xin
    Wu, Dengquan
    Xie, Mingye
    Li, Zixi
    PATTERN RECOGNITION, 2024, 148
  • [47] Infrared and Visible Image Fusion Based on Autoencoder Composed of CNN-Transformer
    Wang, Hongmei
    Li, Lin
    Li, Chenkai
    Lu, Xuanyu
    IEEE ACCESS, 2023, 11 : 78956 - 78969
  • [48] A CNN-Transformer Combined Remote Sensing Imagery Spatiotemporal Fusion Model
    Jiang, Mingyu
    Shao, Hua
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 13995 - 14009
  • [49] Spatiotemporal Multivariate Weather Prediction Network Based on CNN-Transformer
    Wu, Ruowu
    Liang, Yandan
    Lin, Lianlei
    Zhang, Zongwei
    SENSORS, 2024, 24 (23)
  • [50] A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation
    Zhu, Jinjing
    Luo, Yunhao
    Zheng, Xu
    Wang, Hao
    Wang, Lin
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11686 - 11696