Residual multi-fidelity neural network computing

Cited by: 0
Authors
Davis, Owen [1 ]
Motamed, Mohammad [1 ]
Tempone, Raul [2 ,3 ]
Affiliations
[1] Univ New Mexico, Dept Math & Stat, Albuquerque, NM 87106 USA
[2] King Abdullah Univ Sci & Technol KAUST, Comp Elect & Math Sci & Engn Div CEMSE, Thuwal, Saudi Arabia
[3] Rhein Westfal TH Aachen, Aachen, Germany
Keywords
Multi-fidelity computing; Residual modeling; Deep neural networks; Uncertainty quantification; MONTE-CARLO METHOD; ELLIPTIC PDES; APPROXIMATION; ALGORITHM; BOUNDS;
DOI
10.1007/s10543-025-01058-9
Chinese Library Classification (CLC)
TP31 [Computer software]
Discipline Classification Code
081202; 0835
Abstract
In this work, we consider the general problem of constructing a neural network surrogate model using multi-fidelity information. Motivated by error-complexity estimates for ReLU neural networks, we formulate the correlation between an inexpensive low-fidelity model and an expensive high-fidelity model as a possibly non-linear residual function. This function defines a mapping from (1) the shared input space of the models, augmented with the low-fidelity model output, to (2) the discrepancy between the outputs of the two models. The computational framework proceeds by training two neural networks to work in concert. The first network learns the residual function on a small set of high- and low-fidelity data. Once trained, this network is used to generate additional synthetic high-fidelity data, which is used in the training of the second network. The trained second network then acts as our surrogate for the high-fidelity quantity of interest. We present four numerical examples to demonstrate the power of the proposed framework, showing that significant savings in computational cost may be achieved when the output predictions are desired to be accurate within small tolerances.
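The two-network procedure summarized in the abstract can be made concrete with a short sketch. The following is a minimal illustrative PyTorch example, not the authors' implementation: the toy fidelity models f_LF and f_HF, the network widths, the sample sizes, and the training loop are all assumptions chosen only to show the three steps (learn the residual on a small paired set, generate synthetic high-fidelity data, train the surrogate).

```python
# Minimal sketch (assumed setup, not the paper's code) of residual multi-fidelity
# surrogate construction for a scalar quantity of interest on a 1-D input.
import torch
import torch.nn as nn

torch.manual_seed(0)

def f_HF(x):                      # stand-in for an expensive high-fidelity model
    return torch.sin(4.0 * x) + 0.3 * x

def f_LF(x):                      # stand-in for a cheap low-fidelity model
    return torch.sin(4.0 * x)

def mlp(in_dim, width=32, depth=3):
    # Small ReLU network, as in the error-complexity motivation of the abstract.
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, 1))
    return nn.Sequential(*layers)

def train(net, inputs, targets, epochs=2000, lr=1e-3):
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(inputs), targets)
        loss.backward()
        opt.step()
    return net

# Step 1: learn the residual r(x, y_LF) = y_HF - y_LF on a small paired data set.
x_small = torch.linspace(0.0, 1.0, 20).unsqueeze(1)
y_lf_s, y_hf_s = f_LF(x_small), f_HF(x_small)
res_net = train(mlp(in_dim=2),
                torch.cat([x_small, y_lf_s], dim=1),
                y_hf_s - y_lf_s)

# Step 2: generate synthetic high-fidelity data on a large set that only
# requires low-fidelity evaluations plus a forward pass of the residual network.
x_large = torch.rand(2000, 1)
y_lf_l = f_LF(x_large)
with torch.no_grad():
    y_hf_synth = y_lf_l + res_net(torch.cat([x_large, y_lf_l], dim=1))

# Step 3: train the surrogate for the high-fidelity quantity of interest
# on the synthetic data; it maps x directly to an approximation of f_HF(x).
surrogate = train(mlp(in_dim=1), x_large, y_hf_synth)
print(float(nn.functional.mse_loss(surrogate(x_small), y_hf_s)))
```

The point of the construction, as described in the abstract, is that the expensive high-fidelity model is queried only on the small paired set; the large training set for the final surrogate is produced from cheap low-fidelity evaluations corrected by the learned residual.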
Pages: 35
Related papers
50 records in total
  • [31] Multi-fidelity meta modeling using composite neural network with online adaptive basis technique
    Ahn, Jun-Geol
    Yang, Hyun-Ik
    Kim, Jin-Gyun
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2022, 388
  • [32] Multi-fidelity surrogate modeling for temperature field prediction using deep convolution neural network
    Zhang, Yunyang
    Gong, Zhiqiang
    Zhou, Weien
    Zhao, Xiaoyu
    Zheng, Xiaohu
    Yao, Wen
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 123
  • [33] A Deep Neural Network, Multi-fidelity Surrogate Model Approach for Bayesian Model Updating in SHM
    Torzoni, Matteo
    Manzoni, Andrea
    Mariani, Stefano
    EUROPEAN WORKSHOP ON STRUCTURAL HEALTH MONITORING (EWSHM 2022), VOL 2, 2023: 1076-1086
  • [34] A novel multi-fidelity neural network for response prediction using rotor dynamics and model reduction
    Khamari, Debanshu S.
    Behera, Suraj K.
    JOURNAL OF THE BRAZILIAN SOCIETY OF MECHANICAL SCIENCES AND ENGINEERING, 2023, 45 (11)
  • [35] Multi-fidelity convolutional neural network surrogate model for aerodynamic optimization based on transfer learning
    Liao, Peng
    Song, Wei
    Du, Peng
    Zhao, Hang
    PHYSICS OF FLUIDS, 2021, 33 (12)
  • [37] Magnetic Properties Identification by Using a Bi-Objective Optimal Multi-Fidelity Neural Network
    Baldan, Marco
    Di Barba, Paolo
    Nacke, Bernard
    IEEE TRANSACTIONS ON MAGNETICS, 2021, 57 (06)
  • [39] A DeepONet multi-fidelity approach for residual learning in reduced order modeling
    Demo, Nicola
    Tezzele, Marco
    Rozza, Gianluigi
    ADVANCED MODELING AND SIMULATION IN ENGINEERING SCIENCES, 2023, 10 (01)
  • [40] Multi-Fidelity Bayesian Optimization via Deep Neural Networks
    Li, Shibo
    Xing, Wei
    Kirby, Robert M.
    Zhe, Shandian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33