Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms

Cited by: 90
Authors
Guehring, Ingo [1 ]
Kutyniok, Gitta [1 ,2 ,3 ]
Petersen, Philipp [4 ]
Affiliations
[1] Tech Univ Berlin, Inst Math, Berlin, Germany
[2] Tech Univ Berlin, Dept Comp Sci & Elect Engn, Berlin, Germany
[3] Univ Tromso, Dept Phys & Technol, Tromso, Norway
[4] Univ Oxford, Math Inst, Oxford, England
Keywords
Deep neural networks; approximation rates; Sobolev spaces; PDEs; curse of dimension; ALGORITHM; SMOOTH
DOI
10.1142/S0219530519410021
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
We analyze the extent to which deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev-regular functions when the approximation error is measured with respect to weaker Sobolev norms. In this context, we first establish upper approximation bounds by explicitly constructing ReLU neural networks that approximate Sobolev-regular functions, and then establish lower approximation bounds for the same function classes. A trade-off between the regularity used in the approximation norm and the complexity of the neural network appears in both the upper and the lower bounds. Our results extend recent advances in the approximation theory of ReLU networks to the regime most relevant for applications in the numerical analysis of partial differential equations.
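The trade-off described in the abstract can be seen in a small numerical sketch (an illustration only, not the paper's construction: the target f(x) = x^2, the uniform mesh, and the helper names `relu_interpolant` and `sobolev_errors` are all choices made here). A one-hidden-layer ReLU network realizes any piecewise-linear interpolant exactly, and for f(x) = x^2 on [0, 1] the sup-norm error of such an interpolant with mesh width h decays like h^2, while the sup-norm error of its derivative (a W^{1,∞}-type error) decays only like h:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_interpolant(x, knots, values):
    """One-hidden-layer ReLU network realizing the piecewise-linear
    interpolant of (knots, values): each interior knot contributes one
    ReLU unit whose output weight is the jump in slope at that knot."""
    slopes = np.diff(values) / np.diff(knots)
    out = np.full_like(x, values[0], dtype=float)
    out += slopes[0] * relu(x - knots[0])
    for t, jump in zip(knots[1:-1], np.diff(slopes)):
        out += jump * relu(x - t)
    return out

def sobolev_errors(n, f=lambda x: x ** 2, df=lambda x: 2.0 * x):
    """Approximate f on [0, 1] with n uniform pieces; return the sup-norm
    error of the function and of its (piecewise-constant) derivative."""
    knots = np.linspace(0.0, 1.0, n + 1)
    x = np.linspace(0.0, 1.0, 10001)
    err_L = np.max(np.abs(f(x) - relu_interpolant(x, knots, f(knots))))
    # on [t_i, t_{i+1}] the network's derivative is the constant slope s_i,
    # so the derivative error is attained at an endpoint of some piece
    slopes = np.diff(f(knots)) / np.diff(knots)
    err_W1 = max(max(abs(df(a) - s), abs(df(b) - s))
                 for a, b, s in zip(knots[:-1], knots[1:], slopes))
    return err_L, err_W1

if __name__ == "__main__":
    for n in (4, 8, 16):
        err_L, err_W1 = sobolev_errors(n)
        print(f"n={n:2d}  L^inf error={err_L:.6f}  W^(1,inf) error={err_W1:.6f}")
```

Halving the mesh width quarters the L^∞ error but only halves the derivative error, mirroring the loss of one order in the approximation rate when the error is measured in a norm involving one more derivative.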
Pages: 803-859 (57 pages)
Related Papers (50 in total)
  • [21] Robust nonparametric regression based on deep ReLU neural networks
    Chen, Juntong
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2024, 233
  • [22] On Centralization and Unitization of Batch Normalization for Deep ReLU Neural Networks
    Fei, Wen
    Dai, Wenrui
    Li, Chenglin
    Zou, Junni
    Xiong, Hongkai
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72: 2827-2841
  • [23] Deep ReLU neural networks in high-dimensional approximation
    Dung, Dinh
    Nguyen, Van Kien
    NEURAL NETWORKS, 2021, 142: 619-635
  • [24] ReLU deep neural networks from the hierarchical basis perspective
    He, Juncai
    Li, Lin
    Xu, Jinchao
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2022, 120: 105-114
  • [25] Error bounds for maxout neural network approximations of model predictive control
    Teichrib, Dieter
    Darup, Moritz Schulze
    IFAC PAPERSONLINE, 2023, 56 (02): 10113-10119
  • [26] On the CVP for the root lattices via folding with deep ReLU neural networks
    Corlay, Vincent
    Boutros, Joseph J.
    Ciblat, Philippe
    Brunel, Loic
    2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019: 1622-1626
  • [27] Nonparametric regression using deep neural networks with ReLU activation function
    Schmidt-Hieber, Johannes
    ANNALS OF STATISTICS, 2020, 48 (04): 1875-1897
  • [28] Approximation in shift-invariant spaces with deep ReLU neural networks
    Yang, Yunfei
    Li, Zhen
    Wang, Yang
    NEURAL NETWORKS, 2022, 153: 269-281
  • [29] Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations
    Gonon, Lukas
    Schwab, Christoph
    ANALYSIS AND APPLICATIONS, 2023, 21 (01): 1-47
  • [30] Provable Accelerated Convergence of Nesterov's Momentum for Deep ReLU Neural Networks
    Liao, Fangshuo
    Kyrillidis, Anastasios
    INTERNATIONAL CONFERENCE ON ALGORITHMIC LEARNING THEORY, VOL 237, 2024, 237