Estimation of missing logs by regularized neural networks

Cited by: 30
Authors:
Saggaf, MM [1]
Nebrija, EL [1]
Affiliation:
[1] Saudi Aramco, Dhahran 31311, Saudi Arabia
DOI:
10.1306/03110301030
CLC Number:
P [Astronomy, Earth Sciences]
Subject Classification:
07
Abstract:
An approach based on regularized back-propagation neural networks can be used to estimate the missing logs, or parts of those logs, in wells with incomplete log suites. This is done by first analyzing the interdependence of the various log types in a training well that has a complete suite of logs, and then applying the network to nearby wells whose log suites are incomplete to estimate the missing logs in these wells. The accuracy of the method is evaluated by blind tests conducted on real well-log data. These tests indicate that the method produces accurate estimates that are close to the measured log values, and the method can thus be an effective means of enhancing limited suites of wire-line logs. Moreover, this approach has several advantages over the ad hoc practice of manually patching the missing logs from the complete log suites of proximate wells because it is automatic, objective, completely data driven, inherently nonlinear, and does not suffer from the overfitting difficulties commonly associated with conventional back-propagation networks. Additionally, it seems that an accurate selection of the optimal input log types is not necessary because redundant input containing several logs yields reasonably accurate results as long as some of the logs in the input are sufficiently correlated with the missing log.
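The idea described above — train a regularized back-propagation network on a well with a complete log suite, then apply it to a nearby well to estimate a missing log — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data are synthetic stand-ins for wire-line logs (e.g., two measured log types predicting a third), and L2 weight decay is used here as one common form of the regularization the paper refers to.

```python
# Sketch: one-hidden-layer back-propagation network with L2 weight decay,
# predicting one log type from two others. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)

# "Training well": two measured logs (columns of X) and a target log y
# that depends nonlinearly on them, plus measurement noise.
n = 500
X = rng.uniform(-1.0, 1.0, size=(n, 2))
y = np.sin(2.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=n)

# Network: 2 inputs -> 8 tanh hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr, lam = 0.1, 1e-4  # learning rate and L2 regularization strength
for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)            # forward pass
    pred = (H @ W2 + b2).ravel()
    err = pred - y
    # Backward pass; the "+ lam * W" terms are the weight-decay
    # (regularization) gradients that discourage overfitting.
    gW2 = H.T @ err[:, None] / n + lam * W2
    gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ dH / n + lam * W1
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# "Nearby well": same measured log types, but the target log is missing;
# the trained network supplies an estimate.
X_new = rng.uniform(-1.0, 1.0, size=(100, 2))
estimated = (np.tanh(X_new @ W1 + b1) @ W2 + b2).ravel()
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.3f}")
```

In practice the inputs would be depth-aligned log curves from the training well, and (as the abstract notes) the input can be redundant: extra log types do little harm as long as some of them correlate with the missing log.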
Pages: 1377-1389 (13 pages)
Related Papers (50 total)
  • [21] Liu, Hongcheng; Ye, Yinyu; Lee, Hung Yi. High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks. OPERATIONS RESEARCH, 2022, 70(06): 3176-3197
  • [22] Liang, Xiaobo; Wu, Lijun; Li, Juntao; Wang, Yue; Meng, Qi; Qin, Tao; Chen, Wei; Zhang, Min; Liu, Tie-Yan. R-Drop: Regularized Dropout for Neural Networks. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [23] Zhu, Wei; Qiu, Qiang; Huang, Jiaji; Calderbank, Robert; Sapiro, Guillermo; Daubechies, Ingrid. LDMNet: Low Dimensional Manifold Regularized Neural Networks. 2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018: 2743-2751
  • [24] Robin, David A. R.; Scaman, Kevin; Lelarge, Marc. Periodic Signal Recovery with Regularized Sine Neural Networks. NEURIPS WORKSHOP ON SYMMETRY AND GEOMETRY IN NEURAL REPRESENTATIONS, VOL 197, 2022, 197: 98-110
  • [25] Corradi, V; White, H. Regularized Neural Networks - Some Convergence Rate Results. NEURAL COMPUTATION, 1995, 7(06): 1225-1244
  • [26] Ceruti, Claudio; Campadelli, Paola; Casiraghi, Elena. Linear Regularized Compression of Deep Convolutional Neural Networks. IMAGE ANALYSIS AND PROCESSING (ICIAP 2017), PT I, 2017, 10484: 244-253
  • [27] Mudassar, L.; Byun, Y.C. Customer Prediction using Parking Logs with Recurrent Neural Networks. International Journal of Networked and Distributed Computing, 2018, 6(3): 133-142
  • [28] Rolon, Luisa; Mohaghegh, Shahab D.; Ameri, Sam; Gaskari, Razi; McDaniel, Bret. Using artificial neural networks to generate synthetic well logs. JOURNAL OF NATURAL GAS SCIENCE AND ENGINEERING, 2009, 1(4-5): 118-133
  • [29] Zhang, Dongxiao; Chen, Yuntian; Meng, Jin. Synthetic well logs generation via Recurrent Neural Networks. PETROLEUM EXPLORATION AND DEVELOPMENT, 2018, 45(04): 629-639
  • [30] Bengio, Y; Gingras, F. Recurrent neural networks for missing or asynchronous data. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 8: PROCEEDINGS OF THE 1995 CONFERENCE, 1996, 8: 395-401