A spectrum contextual self-attention deep learning network for hyperspectral inversion of soil metals

Cited by: 9
Authors
Zhang, Tingyu [1 ,2 ]
Fu, Quan [3 ]
Tian, Runqing [3 ]
Zhang, Yang [1 ,2 ]
Sun, Zenghui [1 ,2 ]
Affiliations
[1] Minist Nat Resources, Key Lab Degraded & Unused Land Consolidat Engn, Xian, Shaanxi, Peoples R China
[2] Inst Land Engn & Technol, Shaanxi Prov Land Engn Construct Grp, Xian, Shaanxi, Peoples R China
[3] Shaanxi Prov Land Engn Construct Grp Land Survey P, Xian, Shaanxi, Peoples R China
Keywords
Hyperspectral inversion; Self-attention; Metal content; Spectral context
DOI
10.1016/j.ecolind.2023.110351
Chinese Library Classification
X176 [Biodiversity conservation]
Discipline code
090705
Abstract
Retrieving heavy metal concentrations in naturally contaminated arable soils from hyperspectral reflectance has become increasingly important in recent years. At present, both conventional and deep learning-oriented metal inversion methods require a separate model for each metallic element and do not exploit long-range dependencies to capture the inherent heterogeneity of each metal species, which makes accurate inversion of specific heavy metal concentrations difficult. To address this challenge, this study introduces the Spectrum Contextual Self-Attention Deep Learning Network (SCSANet), which captures long-range spectral context dependencies through a self-attention network. The model also incorporates efficient and precise spectral input schemes and outputs multiple metals simultaneously. Experiments in the study area assess the accuracy of the proposed model in estimating lead (Pb), copper (Cu), cadmium (Cd), and mercury (Hg) concentrations. The findings show that spectral preprocessing has no significant effect on metal inversion, whereas improving the neighbourhood-spectrum input scheme boosts inversion accuracy. The proposed SCSANet achieves the highest inversion accuracy for metals of similar content magnitude and outperforms the compared methods.
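The abstract describes the architecture only at a high level (self-attention over spectral context, neighbourhood-spectrum input, simultaneous multi-metal output). The following is a minimal, hypothetical PyTorch sketch of a spectrum-contextual self-attention regressor in that spirit; the band-grouping scheme, layer sizes, use of nn.MultiheadAttention, and all names (SpectralSelfAttentionRegressor, band_group, d_model, n_metals) are illustrative assumptions, not the authors' SCSANet implementation.

```python
# Sketch of a self-attention regressor over spectral-band tokens with one
# output per metal (Pb, Cu, Cd, Hg). All hyperparameters are assumptions.
import torch
import torch.nn as nn


class SpectralSelfAttentionRegressor(nn.Module):
    """Self-attention over groups of neighbouring bands, multi-metal output."""

    def __init__(self, n_bands: int = 224, band_group: int = 16,
                 d_model: int = 64, n_heads: int = 4, n_metals: int = 4):
        super().__init__()
        assert n_bands % band_group == 0, "bands must split evenly into groups"
        self.band_group = band_group
        # Each group of neighbouring bands becomes one token (assumed input scheme).
        self.embed = nn.Linear(band_group, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Single regression head predicting all metals simultaneously.
        self.head = nn.Linear(d_model, n_metals)

    def forward(self, spectra: torch.Tensor) -> torch.Tensor:
        # spectra: (batch, n_bands) reflectance values.
        b, n = spectra.shape
        tokens = spectra.view(b, n // self.band_group, self.band_group)
        x = self.embed(tokens)            # (batch, n_tokens, d_model)
        ctx, _ = self.attn(x, x, x)       # long-range spectral context
        x = self.norm(x + ctx)            # residual connection + layer norm
        pooled = x.mean(dim=1)            # pool over spectral tokens
        return self.head(pooled)          # (batch, n_metals)


if __name__ == "__main__":
    model = SpectralSelfAttentionRegressor()
    fake_spectra = torch.rand(8, 224)     # 8 synthetic reflectance curves
    print(model(fake_spectra).shape)      # torch.Size([8, 4])
```

A single attention layer is shown only to illustrate how long-range spectral dependencies and simultaneous multi-metal regression fit together; the published model may differ in depth, input handling, and training details.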
Pages: 12