A simple remedy for failure modes in physics informed neural networks

Cited: 0
Authors
Farhani, Ghazal [1]
Dashtbayaz, Nima Hosseini [2]
Kazachek, Alexander [3]
Wang, Boyu [2,4]
Affiliations
[1] Natl Res Council Canada, Automot & Surface Transportat, 800 Collip Cir, London, ON N6G 4X8, Canada
[2] Western Univ, Dept Comp Sci, Middlesex Coll, 1151 Richmond St, London, ON N6A 5B7, Canada
[3] Western Univ, Dept Math, Middlesex Coll, 1151 Richmond St, London, ON N6A 5B7, Canada
[4] Vector Inst, 661 Univ Ave,Suite 710, Toronto, ON M5G 1M1, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Physics-informed neural networks; Optimization; Failure modes in PINNs
DOI
10.1016/j.neunet.2024.106963
Chinese Library Classification (CLC) number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Physics-informed neural networks (PINNs) have shown promising results in solving a wide range of problems involving partial differential equations (PDEs). Nevertheless, PINNs often fail when the PDEs become more complex: in particular, when PDE coefficients grow large or the PDEs become increasingly nonlinear, PINNs struggle to converge to the true solution. A noticeable discrepancy emerges between the convergence speeds of the PDE loss and the initial/boundary-condition loss, preventing PINNs from effectively learning the true solutions of these PDEs. In the present work, we leverage neural tangent kernels (NTKs) to investigate the training dynamics of PINNs. Our theoretical analysis reveals that when PINNs are trained using gradient descent with momentum (GDM), the gap in convergence rates between the two loss terms is significantly reduced, thereby enabling the network to learn the exact solution. We also examine why training with the Adam optimizer can accelerate convergence and reduce the effect of this discrepancy. Our numerical experiments confirm that sufficiently wide networks trained with GDM and Adam yield accurate solutions for more complex PDEs.
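
To make the training setup described in the abstract concrete, below is a minimal PyTorch sketch of the generic PINN formulation it refers to: a composite objective with a PDE residual loss on interior collocation points and a boundary-condition loss, minimized with gradient descent with momentum (GDM). This is not the authors' code; the toy Poisson problem, network width, learning rates, and iteration count are illustrative assumptions, and swapping the optimizer for Adam gives the second setting discussed above.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative toy problem (not from the paper): u''(x) = -pi^2 sin(pi x) on [0, 1]
# with u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x).
def pde_rhs(x):
    return -(torch.pi ** 2) * torch.sin(torch.pi * x)

# A wide, shallow fully connected network, in the spirit of the NTK regime.
model = nn.Sequential(nn.Linear(1, 512), nn.Tanh(), nn.Linear(512, 1))

x_int = torch.rand(256, 1, requires_grad=True)   # interior collocation points
x_bc = torch.tensor([[0.0], [1.0]])              # boundary points

def losses():
    # PDE residual loss: enforce u''(x) - f(x) = 0 at the collocation points.
    u = model(x_int)
    du = torch.autograd.grad(u, x_int, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_int, torch.ones_like(du), create_graph=True)[0]
    loss_pde = ((d2u - pde_rhs(x_int)) ** 2).mean()
    # Boundary-condition loss: enforce u(0) = u(1) = 0.
    loss_bc = (model(x_bc) ** 2).mean()
    return loss_pde, loss_bc

# Gradient descent with momentum (GDM); replacing this line with
# torch.optim.Adam(model.parameters(), lr=1e-3) gives the Adam variant.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)

for step in range(5001):
    optimizer.zero_grad()
    loss_pde, loss_bc = losses()
    (loss_pde + loss_bc).backward()
    optimizer.step()
    if step % 1000 == 0:
        # The gap between these two values is the convergence-rate discrepancy
        # that the paper analyzes through the neural tangent kernel.
        print(f"step {step}: pde={loss_pde.item():.3e}, bc={loss_bc.item():.3e}")

Logging the two loss terms separately, as in the print statement, is what exposes the convergence-rate gap that the NTK analysis quantifies.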
Pages: 13