A simple remedy for failure modes in physics informed neural networks

Cited by: 0
Authors
Farhani, Ghazal [1 ]
Dashtbayaz, Nima Hosseini [2 ]
Kazachek, Alexander [3 ]
Wang, Boyu [2 ,4 ]
Affiliations
[1] Natl Res Council Canada, Automot & Surface Transportat, 800 Collip Cir, London, ON N6G 4X8, Canada
[2] Western Univ, Dept Comp Sci, Middlesex Coll, 1151 Richmond St, London, ON N6A 5B7, Canada
[3] Western Univ, Middlesex Coll, Dept Math, 1151 Richmond St, London, ON N6A 5B7, Canada
[4] Vector Inst, 661 Univ Ave,Suite 710, Toronto, ON M5G 1M1, Canada
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Physics-informed neural networks; Optimization; Failure modes in PINNs;
DOI
10.1016/j.neunet.2024.106963
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Physics-informed neural networks (PINNs) have shown promising results in solving a wide range of problems involving partial differential equations (PDEs). Nevertheless, there are several instances of the failure of PINNs when PDEs become more complex. Particularly, when PDE coefficients grow larger or PDEs become increasingly nonlinear, PINNs struggle to converge to the true solution. A noticeable discrepancy emerges in the convergence speed between the PDE loss and the initial/boundary conditions loss, leading to the inability of PINNs to effectively learn the true solutions to these PDEs. In the present work, leveraging the neural tangent kernels (NTKs), we investigate the training dynamics of PINNs. Our theoretical analysis reveals that when PINNs are trained using gradient descent with momentum (GDM), the gap in convergence rates between the two loss terms is significantly reduced, thereby enabling the learning of the exact solution. We also examine why training a model via the Adam optimizer can accelerate the convergence and reduce the effect of the mentioned discrepancy. Our numerical experiments validate that sufficiently wide networks trained with GDM and Adam yield desirable solutions for more complex PDEs.
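The abstract's central claim — that momentum narrows the gap between per-mode convergence rates that the NTK eigenvalues would otherwise pull apart — can be reproduced on a two-mode toy problem. The sketch below is illustrative only: the eigenvalues (100 and 1) and the classical Polyak heavy-ball tuning are our assumptions standing in for the paper's NTK analysis, not its actual setup or code.

```python
import numpy as np

# Toy picture of the NTK argument: after linearization, the training error
# decomposes into eigen-modes, each contracting at a rate set by its NTK
# eigenvalue. A stiff PDE makes the PDE-residual eigenvalues much larger
# than the boundary-condition ones, so plain gradient descent leaves the
# boundary mode far behind. Eigenvalues below are illustrative numbers.
a = np.array([100.0, 1.0])  # curvatures: fast "PDE" mode, slow "boundary" mode

def train(lr, beta, steps=2000):
    """Minimize 0.5 * sum(a * x**2) with heavy-ball momentum (beta=0 -> plain GD)."""
    x = np.ones(2)       # per-mode error, started at 1
    v = np.zeros(2)      # momentum buffer
    hist = [np.abs(x)]
    for _ in range(steps):
        v = beta * v + a * x   # gradient of each quadratic mode is a * x
        x = x - lr * v
        hist.append(np.abs(x))
    return np.array(hist)

def steps_to(hist, tol=1e-4):
    """First iteration at which both modes have dropped below tol."""
    return int(np.argmax(hist.max(axis=1) < tol))

L_max, mu = a.max(), a.min()
gd = train(lr=1.0 / L_max, beta=0.0)  # step size limited by the stiff mode

# Polyak's heavy-ball tuning: both modes then contract at the same
# asymptotic rate sqrt(beta), i.e. the convergence-rate gap closes.
lr_hb = 4.0 / (np.sqrt(L_max) + np.sqrt(mu)) ** 2
beta_hb = ((np.sqrt(L_max) - np.sqrt(mu)) / (np.sqrt(L_max) + np.sqrt(mu))) ** 2
gdm = train(lr=lr_hb, beta=beta_hb)

print(steps_to(gd), steps_to(gdm))  # GDM reaches the tolerance far sooner
```

On this toy problem the slow "boundary" mode is what dominates plain gradient descent's run time, while with momentum both modes converge at a common, much faster rate — the qualitative behavior the paper establishes for wide PINNs.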
Pages: 13
Related Papers
50 records in total
  • [31] Robust Variational Physics-Informed Neural Networks
    Rojas, Sergio
    Maczuga, Pawel
    Muñoz-Matute, Judit
    Pardo, David
    Paszyński, Maciej
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2024, 425
  • [32] Physics-informed neural networks for periodic flows
    Shah, Smruti
    Anand, N. K.
    PHYSICS OF FLUIDS, 2024, 36 (07)
  • [33] A class of improved fractional physics informed neural networks
    Ren, Hongpeng
    Meng, Xiangyun
    Liu, Rongrong
    Hou, Jian
    Yu, Yongguang
    NEUROCOMPUTING, 2023, 562
  • [34] Physics-informed neural networks for diffraction tomography
    Saba, Amirhossein
    Gigli, Carlo
    Ayoub, Ahmed B.
    Psaltis, Demetri
    ADVANCED PHOTONICS, 2022, 4 (06) : 48 - 59
  • [35] On physics-informed neural networks for quantum computers
    Markidis, Stefano
    FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2022, 8
  • [36] Physics-Informed Neural Networks for shell structures
    Bastek, Jan-Hendrik
    Kochmann, Dennis M.
    EUROPEAN JOURNAL OF MECHANICS A-SOLIDS, 2023, 97
  • [37] fPINNs: FRACTIONAL PHYSICS-INFORMED NEURAL NETWORKS
    Pang, Guofei
    Lu, Lu
    Karniadakis, George E. M.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2019, 41 (04): : A2603 - A2626
  • [38] Transfer physics informed neural network: a new framework for distributed physics informed neural networks via parameter sharing
    Manikkan, Sreehari
    Srinivasan, Balaji
    ENGINEERING WITH COMPUTERS, 2023, 39 (04) : 2961 - 2988
  • [40] Parallel Physics-Informed Neural Networks with Bidirectional Balance
    Huang, Yuhao
    Xu, Jiarong
    Fang, Shaomei
    Zhu, Zupeng
    Jiang, Linfeng
    Liang, Xiaoxin
    6TH INTERNATIONAL CONFERENCE ON INNOVATION IN ARTIFICIAL INTELLIGENCE, ICIAI2022, 2022, : 23 - 30