Dual Branch Masked Transformer for Hyperspectral Image Classification

Cited by: 0
Authors
Li, Kuo [1 ]
Chen, Yushi [1 ]
Huang, Lingbo [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Elect & Informat Engn, Harbin 150001, Peoples R China
Keywords
Transformers; Feature extraction; Image reconstruction; Decoding; Training; Hyperspectral imaging; Data mining; Tokenization; Principal component analysis; Computer vision; Classification; hyperspectral image (HSI); masked image modeling; pretraining; transformer
DOI
10.1109/LGRS.2024.3490534
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry]
Subject classification codes
0708; 070902
Abstract
The Transformer has been widely used in hyperspectral image (HSI) classification because of its ability to capture long-range dependencies. However, most Transformer-based classification methods either fail to extract local information or do not combine spatial and spectral information well, resulting in insufficient feature extraction. To address these issues, this study proposes a dual-branch masked Transformer (Dual-MTr) model. The masked Transformer (MTr) pretrains a vision transformer (ViT) by reconstructing both the masked spatial image and the masked spectrum, which embeds a local bias through the process of recovering the global original input from localized patches. Different tokenization methods are used for the two input types: patch embedding with overlapping regions for the 2-D spatial data, and group embedding for the 1-D spectral data. Supervised learning is added to the pretraining process to strengthen discriminability. A dual-branch structure then combines the spatial and spectral features. To tighten the coupling between the two branches, Kullback-Leibler (KL) divergence is used to measure the difference between the classification results of the two branches, and the resulting loss is incorporated into training. Experimental results on two hyperspectral datasets demonstrate the effectiveness of the proposed method compared with other methods.
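The KL-based coupling between the two branches can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the abstract only states that a KL divergence between the two branches' classification results is added to the training loss, so the symmetric KL form, the per-branch cross-entropy terms, and the weight `lam` are assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class (last) axis.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) per sample, summed over classes; eps guards log(0).
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def dual_branch_loss(spatial_logits, spectral_logits, labels, lam=0.5):
    """Cross-entropy on each branch plus a symmetric KL consistency term.

    `lam` and the symmetric form are illustrative assumptions; the letter
    does not specify the exact weighting or direction of the KL term.
    """
    p = softmax(spatial_logits)   # spatial-branch class probabilities
    q = softmax(spectral_logits)  # spectral-branch class probabilities
    idx = np.arange(labels.shape[0])
    # Supervised cross-entropy for both branches.
    ce = -(np.log(p[idx, labels] + 1e-12).mean()
           + np.log(q[idx, labels] + 1e-12).mean())
    # Symmetric KL penalty encouraging the branches to agree.
    consistency = 0.5 * (kl_div(p, q) + kl_div(q, p)).mean()
    return ce + lam * consistency
```

When the two branches produce identical logits the consistency term vanishes and only the cross-entropy remains; the more the branch predictions disagree, the larger the added penalty.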
Pages: 5
Related Papers
50 records in total
  • [1] Dual-Branch Adaptive Convolutional Transformer for Hyperspectral Image Classification
    Wang, Chuanzhi
    Huang, Jun
    Lv, Mingyun
    Wu, Yongmei
    Qin, Ruiru
    REMOTE SENSING, 2024, 16 (09)
  • [2] A Dual-Branch Multiscale Transformer Network for Hyperspectral Image Classification
    Shi, Cuiping
    Yue, Shuheng
    Wang, Liguo
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 20
  • [3] A Center-Masked Transformer for Hyperspectral Image Classification
    Jia, Sen
    Wang, Yifan
    Jiang, Shuguo
    He, Ruyan
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 16
  • [4] Two-Branch Pure Transformer for Hyperspectral Image Classification
    He, Xin
    Chen, Yushi
    Li, Qingyun
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [6] A Dual Frequency Transformer Network for Hyperspectral Image Classification
    Qiao, Xin
    Huang, Weimin
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2023, 16 : 10344 - 10358
  • [7] Dual attention transformer network for hyperspectral image classification
    Shu, Zhenqiu
    Wang, Yuyang
    Yu, Zhengtao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 127
  • [8] SPECTRAL-SPATIAL DUAL-BRANCH CROSS-ENHANCED TRANSFORMER FOR HYPERSPECTRAL IMAGE CLASSIFICATION
    Zhang, Hang
    Zhan, Tianming
    Sun, Le
    2024 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2024), 2024, : 8975 - 8978
  • [9] Double-branch feature fusion transformer for hyperspectral image classification
    Dang, Lanxue
    Weng, Libo
    Hou, Yane
    Zuo, Xianyu
    Liu, Yang
    SCIENTIFIC REPORTS, 2023, 13 (01)