Tackling the curse of dimensionality with physics-informed neural networks

Cited by: 22
Authors:
Hu, Zheyuan [1 ]
Shukla, Khemraj [2 ]
Karniadakis, George Em [2 ]
Kawaguchi, Kenji [1 ]
Affiliations:
[1] Natl Univ Singapore, 21 Lower Kent Ridge Rd, Singapore 119077, Singapore
[2] Brown Univ, Div Appl Math, 182 George St, Providence, RI 02912 USA
Keywords:
Physics-informed neural networks; Curse of dimensionality; Partial differential equations; Deep learning framework; Algorithms; XPINNs
DOI:
10.1016/j.neunet.2024.106369
Chinese Library Classification:
TP18 [Artificial intelligence theory]
Subject classification codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
The curse of dimensionality taxes computational resources heavily, with computational cost increasing exponentially as the dimension grows. This poses great challenges in solving high-dimensional partial differential equations (PDEs), as Richard E. Bellman first pointed out over 60 years ago. While there has been some recent success in solving numerical PDEs in high dimensions, such computations are prohibitively expensive, and true scaling of general nonlinear PDEs to high dimensions has never been achieved. We develop a new method of scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs. The new method, called Stochastic Dimension Gradient Descent (SDGD), decomposes a gradient of PDEs' and PINNs' residual into pieces corresponding to different dimensions and randomly samples a subset of these dimensional pieces in each iteration of training PINNs. We prove theoretically the convergence and other desired properties of the proposed method. We demonstrate in diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions very fast on a single GPU using the PINNs mesh-free approach. Notably, we solve nonlinear PDEs with nontrivial, anisotropic, and inseparable solutions in less than one hour for 1000 dimensions and in 12 h for 100,000 dimensions on a single GPU using SDGD with PINNs. Since SDGD is a general training methodology of PINNs, it can be applied to any current and future variants of PINNs to scale them up for arbitrary high-dimensional PDEs.
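The core idea described in the abstract — splitting a dimension-wise sum in the PDE residual into pieces and sampling a random subset of dimensions per iteration, rescaled to stay unbiased — can be illustrated on a toy separable function. The sketch below is a hypothetical, minimal illustration (not the authors' implementation): for u(x) = Σᵢ aᵢxᵢ², the exact Laplacian is Σᵢ 2aᵢ, and the SDGD-style estimator sums 2aᵢ over k sampled dimensions scaled by d/k; the function name `sdgd_laplacian_estimate`, the coefficients `a`, and all batch sizes are assumptions chosen for the demo.

```python
import numpy as np

# Toy separable function u(x) = sum_i a_i * x_i**2 in d dimensions.
# Its Laplacian is sum_i 2*a_i, a sum over dimensions -- exactly the kind
# of dimension-wise sum that SDGD subsamples during PINN training.
rng = np.random.default_rng(0)
d = 1000                       # problem dimension
a = rng.uniform(0.5, 1.5, d)   # coefficients of the toy solution (assumed)

full_laplacian = np.sum(2.0 * a)  # exact value: sum over all d dimensions

def sdgd_laplacian_estimate(k):
    """Unbiased estimate of the Laplacian from k sampled dimensions,
    rescaled by d/k so its expectation equals the full sum."""
    idx = rng.choice(d, size=k, replace=False)
    return (d / k) * np.sum(2.0 * a[idx])

# Each cheap k-dimension estimate is noisy but unbiased; averaging many
# of them recovers the full-dimensional value, which is why stochastic
# gradients built this way still converge.
estimates = [sdgd_laplacian_estimate(k=32) for _ in range(2000)]
rel_error = abs(np.mean(estimates) - full_laplacian) / full_laplacian
print(rel_error)  # small relative error
```

In actual PINN training the per-dimension terms ∂²u/∂xᵢ² come from automatic differentiation of the network rather than a closed form, but the sampling-and-rescaling structure of the gradient estimator is the same.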
Pages: 21