An Improved Method for Physics-Informed Neural Networks That Accelerates Convergence

Cited by: 5
Authors
Yan, Liangliang [1 ,2 ]
Zhou, You [1 ,2 ]
Liu, Huan [3 ]
Liu, Lingqi [1 ,2 ]
Affiliations
[1] Chengdu Univ Technol, Planetary Sci Res Ctr, Chengdu 610059, Peoples R China
[2] Chengdu Univ Technol, Sch Comp & Secur, Chengdu 610059, Peoples R China
[3] Jinggangshan Univ, Coll Elect & Informat Engn, Jian 343900, Jiangxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Physics-informed neural network; partial differential equations; multi-input residual network; convergence speed; unsupervised learning;
DOI
10.1109/ACCESS.2024.3354058
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Physics-Informed Neural Networks (PINNs) have proven highly effective for solving high-dimensional Partial Differential Equations (PDEs) and have demonstrated great potential in a variety of challenging scenarios. However, traditional (vanilla) PINNs, typically built on fully connected neural networks (FCNNs), often suffer from slow convergence and parameter redundancy. This paper proposes a novel approach that combines a multi-input residual network with a multi-step training paradigm to enable unsupervised training. The improved method, which we name MultiInNet PINNs, enhances the convergence speed and stability of traditional PINNs. Our experiments show that MultiInNet PINNs achieve better convergence with fewer parameters than other networks such as FCNN, ResNet, and UNet. Specifically, the multi-step training increases convergence speed by approximately 45%, and the MultiInNet enhancement contributes a further 50%, for a total improvement of about 70% over vanilla PINNs. This faster convergence reduces the computational cost of training PINNs. Moreover, MultiInNet PINNs offer a potential way to handle initial and boundary conditions (I/BCs) separately within PINNs.
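The record does not include the authors' implementation, so the following is only a minimal, hedged sketch of the general idea: an unsupervised physics-informed loss trained on collocation points, using a residual network that re-injects the coordinate input at every block, which is one plausible reading of the "multi-input residual network" described in the abstract. PyTorch, the toy 1D Poisson problem, and all names (MultiInResBlock, MultiInResNet, pinn_loss) are assumptions for illustration; the paper's multi-step training paradigm and separate I/BC handling are not reproduced here.

```python
# Illustrative sketch only -- not the authors' released code.
# Solves u''(x) = -pi^2 sin(pi x) on (0, 1) with u(0) = u(1) = 0.
import torch
import torch.nn as nn

class MultiInResBlock(nn.Module):
    """Residual block that receives the raw coordinates alongside the hidden state."""
    def __init__(self, width: int, in_dim: int):
        super().__init__()
        self.fc1 = nn.Linear(width + in_dim, width)
        self.fc2 = nn.Linear(width, width)
        self.act = nn.Tanh()

    def forward(self, h, x):
        z = self.act(self.fc1(torch.cat([h, x], dim=-1)))
        return h + self.fc2(z)  # identity skip connection

class MultiInResNet(nn.Module):
    """Stack of multi-input residual blocks mapping coordinates x to the solution u(x)."""
    def __init__(self, in_dim: int = 1, width: int = 32, depth: int = 4, out_dim: int = 1):
        super().__init__()
        self.lift = nn.Linear(in_dim, width)
        self.blocks = nn.ModuleList(MultiInResBlock(width, in_dim) for _ in range(depth))
        self.head = nn.Linear(width, out_dim)

    def forward(self, x):
        h = torch.tanh(self.lift(x))
        for blk in self.blocks:
            h = blk(h, x)  # coordinate input is fed into every block
        return self.head(h)

def pinn_loss(model, x_interior, x_boundary):
    """Unsupervised physics-informed loss: PDE residual plus boundary-condition penalty."""
    x = x_interior.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    pde_res = d2u + (torch.pi ** 2) * torch.sin(torch.pi * x)  # u'' = -pi^2 sin(pi x)
    bc_res = model(x_boundary)                                  # u(0) = u(1) = 0
    return pde_res.pow(2).mean() + bc_res.pow(2).mean()

model = MultiInResNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x_interior = torch.rand(256, 1)            # collocation points in (0, 1)
x_boundary = torch.tensor([[0.0], [1.0]])  # domain endpoints
for step in range(2000):
    optimizer.zero_grad()
    loss = pinn_loss(model, x_interior, x_boundary)
    loss.backward()
    optimizer.step()
```

In this sketch the boundary condition is enforced with a simple penalty term added to the PDE residual loss; the separate I/BC handling mentioned in the abstract would replace that penalty with the paper's own mechanism.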
Pages: 23943-23953
Number of Pages: 11