Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms

Cited by: 90
Authors
Guehring, Ingo [1]
Kutyniok, Gitta [1,2,3]
Petersen, Philipp [4]
Affiliations
[1] Tech Univ Berlin, Inst Math, Berlin, Germany
[2] Tech Univ Berlin, Dept Comp Sci & Elect Engn, Berlin, Germany
[3] Univ Tromso, Dept Phys & Technol, Tromso, Norway
[4] Univ Oxford, Math Inst, Oxford, England
Keywords
Deep neural networks; approximation rates; Sobolev spaces; PDEs; curse of dimension
DOI
10.1142/S0219530519410021
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We analyze to what extent deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev-regular functions when the approximation error is measured in weaker Sobolev norms. In this context, we first establish upper approximation bounds for Sobolev-regular functions by explicitly constructing the approximating ReLU neural networks. We then establish lower approximation bounds for the same function classes. Both the upper and the lower bounds exhibit a trade-off between the regularity used in the approximation norm and the complexity of the neural network. Our results extend recent advances in the approximation theory of ReLU networks to the regime that is most relevant for applications in the numerical analysis of partial differential equations.
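To make the trade-off concrete, here is a schematic LaTeX rendering of the type of upper bound the paper proves (a sketch, not a verbatim theorem: the precise constants, logarithmic factors, and admissible ranges of the parameters are as stated in the paper; below, d is the input dimension, n the Sobolev regularity of the target function, s the regularity of the approximation norm, and \varepsilon the target accuracy):

% Schematic upper bound (sketch; see the paper for the exact statement):
% for f in the unit ball of W^{n,p}((0,1)^d) and 0 <= s <= 1, there is a
% ReLU network \Phi_\varepsilon achieving accuracy \varepsilon in W^{s,p},
\[
  \| f - \Phi_\varepsilon \|_{W^{s,p}((0,1)^d)} \le \varepsilon ,
\]
% whose number of nonzero weights M grows, up to logarithmic factors, like
\[
  M(\Phi_\varepsilon) \;\lesssim\; \varepsilon^{-d/(n-s)} .
\]
% A weaker approximation norm (smaller s) therefore permits a smaller
% network; the paper's lower bounds show the same qualitative dependence
% on s, so the trade-off is genuine rather than an artifact of the
% construction.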
Pages: 803-859
Number of pages: 57
Related papers (50 records in total)
  • [1] Error bounds for approximations with deep ReLU networks
    Yarotsky, Dmitry
    NEURAL NETWORKS, 2017, 94 : 103 - 114
  • [2] On the Error Bounds for ReLU Neural Networks
    Katende, Ronald
    Kasumba, Henry
    Kakuba, Godwin
    Mango, John
    IAENG INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS, 2024, 54 (12) : 2602 - 2611
  • [3] Error Bounds for Approximations Using Multichannel Deep Convolutional Neural Networks with Downsampling
    Liu, Xinling
    Hou, Jingyao
    JOURNAL OF APPLIED MATHEMATICS, 2023, 2023
  • [4] New Error Bounds for Deep ReLU Networks Using Sparse Grids
    Montanelli, Hadrien
    Du, Qiang
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2019, 1 (01) : 78 - 92
  • [5] Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
    Montanelli, Hadrien
    Yang, Haizhao
    NEURAL NETWORKS, 2020, 129 : 1 - 6
  • [6] Tight Bounds on the Smallest Eigenvalue of the Neural Tangent Kernel for Deep ReLU Networks
    Nguyen, Quynh
    Mondelli, Marco
    Montufar, Guido
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [7] Error bounds for ReLU networks with depth and width parameters
    Kang, Jae-Mo
    Moon, Sunghwan
    JAPAN JOURNAL OF INDUSTRIAL AND APPLIED MATHEMATICS, 2023, 40 (01) : 275 - 288
  • [8] Towards Lower Bounds on the Depth of ReLU Neural Networks
    Hertrich, Christoph
    Basu, Amitabh
    Di Summa, Marco
    Skutella, Martin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [9] Generalization Error Bounds of Gradient Descent for Learning Over-Parameterized Deep ReLU Networks
    Cao, Yuan
    Gu, Quanquan
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI 2020), 2020, 34 : 3349 - 3356