A hybrid network for three-dimensional seismic fault segmentation based on nested residual attention and self-attention mechanism

Cited by: 0
Authors
Sun, Qifeng [1 ]
Jiang, Hui [1 ]
Du, Qizhen [2 ,3 ]
Gong, Faming [1 ]
Affiliations
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, Qingdao, Peoples R China
[2] China Univ Petr East China, Natl Key Lab Deep Oil & Gas, Qingdao, Peoples R China
[3] Qingdao Marine Sci & Technol Ctr, Lab Marine Mineral Resources, Qingdao, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
3D; faults; interpretation;
DOI
10.1111/1365-2478.13655
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline classification codes
0708; 070902;
Abstract
Fault detection is a crucial step in seismotectonic interpretation and oil and gas exploration. In recent years, deep learning has proven to be an effective approach for detecting faults; however, owing to complex geological structures and seismic noise, the results of such approaches remain unsatisfactory. In this study, we propose a hybrid network (NRA-SANet) that integrates a self-attention mechanism into a nested residual attention network for three-dimensional seismic fault segmentation. In NRA-SANet, the nested residual encoding structure is designed to fuse multi-scale fault features and fully exploit fine-grained fault information. The two-head self-attention decoding structure is designed to construct long-distance fault dependencies from different feature representation subspaces, which enhances the model's understanding of the global fault distribution. To suppress the interference of seismic noise, we propose a fault-attention module and embed it into the model; it employs a weighting strategy and a separate-and-reconstruct channel strategy to improve the model's sensitivity to fault regions. Experiments demonstrate that NRA-SANet exhibits strong noise robustness and detects more continuous and smaller-scale faults than other approaches on field seismic data. This study provides a new approach for advancing seismic interpretation.
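The abstract describes a two-head self-attention decoder that relates voxels of a 3D seismic feature volume across long distances. The paper itself provides no code, so the following PyTorch sketch is only an illustration of how such a block might be structured; the class name TwoHeadSelfAttention3D, the channel count, and the use of nn.MultiheadAttention are assumptions for demonstration, not the authors' NRA-SANet implementation.

```python
# Hypothetical sketch of a two-head self-attention block over a 3D feature volume.
# Not the authors' NRA-SANet code; names and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn


class TwoHeadSelfAttention3D(nn.Module):
    """Applies multi-head self-attention across all voxels of a 3D feature map."""

    def __init__(self, channels: int, num_heads: int = 2):
        super().__init__()
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(embed_dim=channels,
                                          num_heads=num_heads,
                                          batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, depth, height, width)
        b, c, d, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)   # (batch, d*h*w, channels)
        tokens = self.norm(tokens)
        attended, _ = self.attn(tokens, tokens, tokens)
        attended = attended + tokens            # residual connection
        return attended.transpose(1, 2).view(b, c, d, h, w)


if __name__ == "__main__":
    # Small synthetic volume: batch of 1, 32 channels, 8x8x8 voxels.
    feats = torch.randn(1, 32, 8, 8, 8)
    block = TwoHeadSelfAttention3D(channels=32, num_heads=2)
    print(block(feats).shape)  # torch.Size([1, 32, 8, 8, 8])
```

Flattening the volume into a token sequence lets each voxel attend to every other voxel, which is one plausible way to realize the "long-distance fault dependencies" the abstract refers to; real seismic volumes would typically require windowed or downsampled attention for memory reasons.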
Pages: 575-594
Number of pages: 20