Crossmodal Sequential Interaction Network for Hyperspectral and LiDAR Data Joint Classification

Cited by: 2
Authors
Yu, Wenbo [1 ,2 ]
Huang, He [1 ]
Shen, Yi [3 ]
Shen, Gangxiang [1 ]
Affiliations
[1] Soochow Univ, Sch Elect & Informat Engn, Suzhou 215006, Peoples R China
[2] Chinese Acad Sci, Aerosp Informat Res Inst, State Key Lab Remote Sensing Sci, Beijing 100101, Peoples R China
[3] Harbin Inst Technol, Dept Control Sci & Engn, Harbin 150001, Peoples R China
Keywords
Laser radar; Three-dimensional displays; Spatial diversity; Single-photon avalanche diodes; Streams; Feature extraction; Task analysis; Crossmodal sequential characteristic (seqCHA); hyperspectral (HS); joint classification; light detection and ranging (LiDAR); multimodality
DOI
10.1109/LGRS.2024.3365715
CLC Number
P3 [Geophysics]; P59 [Geochemistry]
Subject Classification Code
0708; 070902
Abstract
Numerous deep learning (DL) studies have shown that fusing hyperspectral (HS) and light detection and ranging (LiDAR) data is effective for land-cover classification. However, the sequential characteristics (seqCHAs) in the spatial domain remain ambiguous and are often neglected. In this letter, we propose a deep crossmodal sequential interaction network (CsiNet) for joint HS and LiDAR data classification. We aim to verify the contribution of crossmodal seqCHAs to multimodal joint classification tasks and present an effective crossmodal sequential flattening (SF) strategy. Specifically, CsiNet sorts the neighboring samples according to the spectral and 3-D spatial diversities between each sample and the central one. Notably, the 3-D spatial diversity simultaneously considers the shared sample positions in both modalities and the sample elevations in the LiDAR data. By using long short-term memory (LSTM) layers, CsiNet extracts crossmodal sequential features comprehensively and models the sequential properties of samples better than convolution-based networks. Experiments conducted on the MUUFL Gulfport (MUUFL) and Houston 2013 datasets demonstrate that CsiNet outperforms several state-of-the-art techniques both qualitatively and quantitatively. With 1% of the training samples per category, the overall accuracies of CsiNet on the two datasets reach 90.27% and 92.41%, exceeding the best comparison technique by 0.27% and 0.17%, respectively. Ablation experiments verify the effectiveness of the crossmodal SF strategy by replacing it with several alternative ones.
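As a rough illustration of the crossmodal SF idea summarized above, the minimal Python sketch below sorts the neighbors of a patch by a weighted combination of spectral diversity and 3-D spatial diversity (shared pixel offsets plus LiDAR elevation gaps) and feeds the resulting sequence to an LSTM. The function name crossmodal_flatten, the weights w_spec and w_spat, the 7 x 7 patch size, and the 128-unit hidden layer are illustrative assumptions, not details taken from the letter.

```python
# Hedged sketch of a crossmodal sequential flattening (SF) step, assuming
# square patches and Euclidean diversity measures; all names are illustrative.
import numpy as np
import torch
import torch.nn as nn

def crossmodal_flatten(hs_patch, lidar_patch, w_spec=1.0, w_spat=1.0):
    """Sort the neighbors of a patch by a combined spectral / 3-D spatial
    diversity w.r.t. the central sample and return the flattened sequence.

    hs_patch    : (k, k, B) hyperspectral patch
    lidar_patch : (k, k)    LiDAR elevation patch
    """
    k = hs_patch.shape[0]
    c = k // 2                                  # index of the central sample
    rows, cols = np.meshgrid(np.arange(k), np.arange(k), indexing="ij")

    # Spectral diversity: Euclidean distance to the central spectrum.
    spec_div = np.linalg.norm(hs_patch - hs_patch[c, c], axis=-1)

    # 3-D spatial diversity: shared 2-D offset plus LiDAR elevation gap.
    spat_div = np.sqrt((rows - c) ** 2 + (cols - c) ** 2
                       + (lidar_patch - lidar_patch[c, c]) ** 2)

    order = np.argsort((w_spec * spec_div + w_spat * spat_div).ravel())

    # Stack HS bands and elevation per sample, then reorder into a sequence.
    joint = np.concatenate([hs_patch.reshape(k * k, -1),
                            lidar_patch.reshape(k * k, 1)], axis=1)
    return joint[order]                         # (k*k, B + 1)

# Feed the flattened sequence into an LSTM-based classifier (illustrative sizes).
seq = torch.tensor(crossmodal_flatten(np.random.rand(7, 7, 64),
                                      np.random.rand(7, 7)),
                   dtype=torch.float32).unsqueeze(0)   # (1, 49, 65)
lstm = nn.LSTM(input_size=65, hidden_size=128, batch_first=True)
_, (h_n, _) = lstm(seq)
logits = nn.Linear(128, 11)(h_n[-1])                   # e.g., 11 MUUFL classes
```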
Pages: 1-5
Page count: 5
Related Papers
50 records in total
  • [21] A novel graph-attention based multimodal fusion network for joint classification of hyperspectral image and LiDAR data
    Cai, Jianghui; Zhang, Min; Yang, Haifeng; He, Yanting; Yang, Yuqing; Shi, Chenhui; Zhao, Xujun; Xun, Yaling
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [22] Joint Classification of Hyperspectral and LiDAR Data Using Height Information Guided Hierarchical Fusion-and-Separation Network
    Song, Tiecheng; Zeng, Zheng; Gao, Chenqiang; Chen, Haonan; Li, Jun
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62: 1-15
  • [23] Joint Classification of Hyperspectral and LiDAR Data via Multiprobability Decision Fusion Method
    Chen, Tao; Chen, Sizuo; Chen, Luying; Chen, Huayue; Zheng, Bochuan; Deng, Wu
    REMOTE SENSING, 2024, 16 (22)
  • [24] Multi-Scale Feature Extraction for Joint Classification of Hyperspectral and LiDAR Data
    Xi, Yongqiang; Ye, Zhen
    Journal of Beijing Institute of Technology (English Edition), 2023, 32 (01): 13-22
  • [25] Joint Classification of Hyperspectral Image and LiDAR Data Based on Spectral Prompt Tuning
    Kong, Yi; Cheng, Yuhu; Chen, Yang; Wang, Xuesong
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [26] DUAL GRAPH CONVOLUTION JOINT DENSE NETWORKS FOR HYPERSPECTRAL AND LIDAR DATA CLASSIFICATION
    Guo, Fangming; Li, Zhongwei; Meng, Qiao; Wang, Leiquan; Zhang, Jie
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022: 1141-1144
  • [27] Multi-Scale Feature Extraction for Joint Classification of Hyperspectral and LiDAR Data
    Xi, Yongqiang; Ye, Zhen
    Journal of Beijing Institute of Technology, 2023, 32 (01): 13-22
  • [28] MULTI-SCALE FEATURE FUSION FOR HYPERSPECTRAL AND LIDAR DATA JOINT CLASSIFICATION
    Zhang, Maqun; Gao, Feng; Dong, Junyu; Qi, Lin
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022: 2856-2859
  • [29] Hyperspectral and LiDAR Data Classification Using Joint CNNs and Morphological Feature Learning
    Roy, Swalpa Kumar; Deria, Ankur; Hong, Danfeng; Ahmad, Muhammad; Plaza, Antonio; Chanussot, Jocelyn
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [30] MAMInet II: Illumination-Insensitive Modalitywise Assimilation Guided Multistage Interaction Network for Hyperspectral and LiDAR Joint Classification
    Wei, Xintong; Yu, Wenbo; Huang, He; Sun, Lin; Zhao, Chongran; Shen, Gangxiang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62