Learning in sinusoidal spaces with physics-informed neural networks

Cited by: 26
Authors
Wong J.C. [1,2,4]
Ooi C.C. [1,4]
Gupta A. [3,4]
Ong Y.S. [4,5]
Affiliations
[1] Institute of High Performance Computing, Agency for Science, Technology and Research, Singapore
[2] School of Computer Science and Engineering, Nanyang Technological University, Singapore
[3] Singapore Institute of Manufacturing Technology, Agency for Science, Technology and Research, Singapore
[4] Agency for Science, Technology and Research, Singapore
[5] Data Science and Artificial Intelligence Research Centre, School of Computer Science and Engineering, Nanyang Technological University, Singapore
Source
IEEE Transactions on Artificial Intelligence
Keywords
Differential equations; physics-informed neural networks (PINNs); sinusoidal spaces
DOI
10.1109/TAI.2022.3192362
Abstract
A physics-informed neural network (PINN) uses physics-augmented loss functions, e.g., incorporating the residual term from governing partial differential equations (PDEs), to ensure its output is consistent with fundamental physics laws. However, it turns out to be difficult to train an accurate PINN model for many problems in practice. In this article, we present a novel perspective on the merits of learning in sinusoidal spaces with PINNs. By analyzing behavior at model initialization, we first show that a PINN of increasing expressiveness induces an initial bias around flat output functions. Notably, this initial solution can be very close to satisfying many physics PDEs, i.e., falling into a local minimum of the PINN loss that only minimizes PDE residuals, while still being far from the true solution that jointly minimizes PDE residuals and the initial and/or boundary conditions. It is difficult for gradient descent optimization to escape from such a local minimum trap, often causing the training to stall. We then prove that the sinusoidal mapping of inputs, in an architecture we label sf-PINN, is effective in increasing input gradient variability, thus avoiding entrapment in such a deceptive local minimum. The level of variability can be effectively modulated to match high-frequency patterns in the problem at hand. A key facet of this article is the comprehensive empirical study that demonstrates the efficacy of learning in sinusoidal spaces with PINNs for a wide range of forward and inverse modeling problems spanning multiple physics domains.

Impact Statement: Falling under the emerging field of physics-informed machine learning, PINN models have tremendous potential as a unifying AI framework for assimilating physics theory and measurement data. However, they remain infeasible for broad science and engineering applications due to computational cost and training challenges, especially for more complex problems. Instead of focusing on empirical demonstration of applicability to a specific problem, this paper analyzes and provides novel perspectives on why a typical PINN implementation frequently exhibits training difficulty, even on a seemingly straightforward 1D model problem. Interestingly, the issues uncovered explain how common heuristics for data-fit neural networks (e.g., initialization) cause training difficulties in this new paradigm of PINNs. Critically, we believe this to be among the first work to analytically provide insight into how simply adapting conventional neural network implementations may be broadly problematic for PINNs across different PDEs (physics). Guided by this insight, we then provide theoretical results on the benefit of learning in sinusoidal space with PINNs. Utilizing sinusoidal spaces initializes sf-PINNs with input gradient distributions that can overcome the trainability issues uncovered, as further demonstrated by a comprehensive empirical study on a wide range of forward and inverse modeling problems spanning multiple physics domains. sf-PINNs can improve model accuracy by a few orders of magnitude. Our paper also provides a theoretical foundation for unifying and explaining purported benefits from previously published empirical works, thereby advancing our understanding of this new PINN paradigm. © 2024 Institute of Electrical and Electronics Engineers Inc. All rights reserved.
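The two ideas the abstract describes, a physics-augmented loss that jointly penalizes the PDE residual and the boundary conditions, and a sinusoidal mapping of the inputs ahead of an otherwise standard network, can be illustrated with a minimal sketch. The sketch below assumes PyTorch and a toy 1D Poisson problem u''(x) = -sin(x) with zero boundary values; the class name SFPinn, the frequency parameter omega, and the helper pde_residual are illustrative stand-ins, not the paper's implementation.

    import torch
    import torch.nn as nn

    class SFPinn(nn.Module):
        """Sketch of a PINN with a sinusoidal input mapping (names illustrative)."""
        def __init__(self, in_dim=1, width=64, omega=10.0):
            super().__init__()
            self.omega = omega                    # modulates input-gradient variability
            self.lift = nn.Linear(in_dim, width)  # x -> sin(omega * (Wx + b))
            self.body = nn.Sequential(
                nn.Linear(width, width), nn.Tanh(),
                nn.Linear(width, 1),
            )

        def forward(self, x):
            return self.body(torch.sin(self.omega * self.lift(x)))

    def pde_residual(model, x):
        # Residual of the toy problem u''(x) + sin(x) = 0, via autograd.
        x = x.requires_grad_(True)
        u = model(x)
        du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
        d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
        return d2u + torch.sin(x)

    model = SFPinn()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x_int = torch.rand(128, 1) * 2 * torch.pi     # interior collocation points
    x_bc = torch.tensor([[0.0], [2 * torch.pi]])  # boundary points where u = 0
    for step in range(2000):
        opt.zero_grad()
        loss_pde = pde_residual(model, x_int).pow(2).mean()  # PDE residual term
        loss_bc = model(x_bc).pow(2).mean()                  # boundary-condition term
        (loss_pde + loss_bc).backward()           # joint physics-augmented loss
        opt.step()

Under this reading of the paper, raising omega increases the variability of the network's input gradients at initialization, which is the mechanism credited for escaping the flat-output local minimum; removing the torch.sin lift recovers a plain PINN that minimizes loss_pde alone near its flat initialization while loss_bc keeps it far from the true solution.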
Pages: 985-1000
Number of pages: 15