Self-potential inversion based on Attention U-Net deep learning network

Cited by: 1
Authors
Guo, You-jun [1 ,3 ,4 ]
Cui, Yi-an [1 ,3 ]
Chen, Hang [2 ]
Xie, Jing [1 ,5 ]
Zhang, Chi [4 ]
Liu, Jian-xin [1 ,3 ]
Affiliations
[1] Cent South Univ, Sch Geosci & Info Phys, Changsha 410083, Peoples R China
[2] Boise State Univ, Dept Geosci, Boise, ID 83725 USA
[3] Key Lab Nonferrous & Geol Hazard Detect, Changsha 410083, Peoples R China
[4] Univ Vienna, Dept Meteorol & Geophys, A-1090 Vienna, Austria
[5] Peking Univ, Sch Earth & Space Sci, Beijing 100871, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
self-potential; attention mechanism; U-Net deep learning network; inversion; landfill; LANDFILL; RESISTIVITY; PLUMES;
DOI
10.1007/s11771-024-5755-8
Chinese Library Classification
TF [Metallurgical Industry];
Discipline classification code
0806;
Abstract
Landfill leaks pose a serious threat to environmental health, risking the contamination of both groundwater and soil resources. Accurate investigation of these sites is essential for implementing effective prevention and control measures. The self-potential (SP) method stands out for its sensitivity to contamination plumes, offering a solution for monitoring and detecting the movement and seepage of subsurface pollutants. However, traditional SP inversion techniques rely heavily on precise subsurface resistivity information. In this study, we propose an Attention U-Net deep learning network for rapid SP inversion. By incorporating an attention mechanism, this algorithm effectively learns the relationship between array-style SP data and the location and extent of subsurface contamination sources. We designed a synthetic landfill model with a heterogeneous resistivity structure to assess the performance of the Attention U-Net. Additionally, we conducted further validation using a laboratory model to assess its practical applicability. The results demonstrate that the algorithm is not solely dependent on resistivity information, enabling effective localization of the source distribution even in models with intricate subsurface structures. Our work provides a promising tool for SP data processing, enhancing the applicability of this method in near-surface environmental monitoring.
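The attention mechanism the abstract refers to is, in the standard Attention U-Net design, an additive attention gate on each skip connection: decoder features gate the encoder features so that only regions relevant to the target (here, the contamination source) pass through. As a minimal NumPy sketch — not the authors' exact implementation; all weight shapes, names, and the per-pixel 1x1-convolution formulation are assumptions — the gate can be written as:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(x, g, W_x, W_g, psi):
    """Additive attention gate in the Attention U-Net style:
    the decoder's gating signal g re-weights the encoder's
    skip-connection features x before they are concatenated.

    x   : (C, H, W) skip-connection feature map from the encoder
    g   : (C, H, W) gating feature map from the decoder (same size here)
    W_x : (F, C) 1x1-conv weights for x (channel mixing per pixel)
    W_g : (F, C) 1x1-conv weights for g
    psi : (1, F) weights producing one scalar attention coefficient per pixel
    """
    # Project both inputs into a joint intermediate space
    # (a 1x1 convolution is a per-pixel matrix multiply over channels)
    theta_x = np.einsum('fc,chw->fhw', W_x, x)
    phi_g = np.einsum('fc,chw->fhw', W_g, g)
    # Additive attention: alpha in (0, 1), one coefficient per pixel
    alpha = sigmoid(np.einsum('of,fhw->ohw', psi, relu(theta_x + phi_g)))
    # Suppress irrelevant skip features, pass salient ones through
    return x * alpha  # alpha broadcasts over the channel axis

# Tiny demo with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
C, F, H, W = 4, 8, 6, 6
x = rng.standard_normal((C, H, W))
g = rng.standard_normal((C, H, W))
out = attention_gate(x, g, rng.standard_normal((F, C)),
                     rng.standard_normal((F, C)), rng.standard_normal((1, F)))
print(out.shape)  # (4, 6, 6)
```

Because the sigmoid keeps each coefficient strictly between 0 and 1, the gate can only attenuate skip features, never amplify them — which is what lets the network suppress responses from resistivity heterogeneity that is unrelated to the source location.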
Pages: 3156-3167
Page count: 12
Related papers
50 records in total
  • [1] Gravity data density interface inversion based on U-net deep learning network
    Li Yang
    Han LiGuo
    Zhou Shuai
    Lin Tao
    CHINESE JOURNAL OF GEOPHYSICS-CHINESE EDITION, 2023, 66 (01): : 401 - 411
  • [2] Multichannel seismic impedance inversion based on Attention U-Net
    Ning, Juan
    Li, Shu
    Wei, Zong
    Yang, Xi
    FRONTIERS IN EARTH SCIENCE, 2023, 11
  • [3] Enhancing seismic resolution based on U-Net deep learning network
    Li, Zeyu
    Wang, Guoquan
    Zhu, Chenghong
    Chen, Shuangquan
    JOURNAL OF SEISMIC EXPLORATION, 2023, 32 (04): : 315 - 336
  • [4] Improved Photoacoustic Imaging of Numerical Bone Model Based on Attention Block U-Net Deep Learning Network
    Chen, Panpan
    Liu, Chengcheng
    Feng, Ting
    Li, Yong
    Ta, Dean
    APPLIED SCIENCES-BASEL, 2020, 10 (22): : 1 - 18
  • [5] Seismic data fault detection based on U-Net deep learning network
    Yang W.
    Yang J.
    Chen S.
    Kuang L.
    Wang E.
    Zhou C.
    Shiyou Diqiu Wuli Kantan/Oil Geophysical Prospecting, 2021, 56 (04): : 688 - 697
  • [6] AUTL: An Attention U-Net Transfer Learning Inversion Framework for Magnetotelluric Data
    Gao, Ci
    Li, Yabin
    Wang, Xueqiu
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2024, 21
  • [7] Deep Learning Based Model Observer by U-Net
    Lorente, Iris
    Abbey, Craig
    Brankov, Jovan G.
    MEDICAL IMAGING 2020: IMAGE PERCEPTION, OBSERVER PERFORMANCE, AND TECHNOLOGY ASSESSMENT, 2020, 11316
  • [8] Multitask Full Attention U-Net for Prestack Seismic Inversion
    Liu, Xudong
    Wu, Bangyu
    Yang, Hui
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [9] Residual u-net with Self-Attention based deep convolutional adaptive capsule network for liver cancer segmentation and classification
    Archana, R.
    Anand, L.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 105
  • [10] A Feature Attention Dehazing Network based on U-Net and Dense Connection
    Jing, Hongyuan
    Zha, Quanxing
    Fu, Yiran
    Lv, Hejun
    Chen, Aidong
    THIRTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING (ICGIP 2021), 2022, 12083