Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms

Cited by: 90
Authors
Guehring, Ingo [1]
Kutyniok, Gitta [1,2,3]
Petersen, Philipp [4]
Affiliations
[1] Tech Univ Berlin, Inst Math, Berlin, Germany
[2] Tech Univ Berlin, Dept Comp Sci & Elect Engn, Berlin, Germany
[3] Univ Tromso, Dept Phys & Technol, Tromso, Norway
[4] Univ Oxford, Math Inst, Oxford, England
Keywords
Deep neural networks; approximation rates; Sobolev spaces; PDEs; curse of dimension; algorithm; smooth
DOI
10.1142/S0219530519410021
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We analyze to what extent deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev regular functions when the approximation error is measured in weaker Sobolev norms. In this context, we first establish upper approximation bounds for Sobolev regular functions by explicitly constructing the approximating ReLU neural networks. We then establish lower approximation bounds for the same function classes. Both the upper and the lower bounds exhibit a trade-off between the regularity used in the approximation norm and the complexity of the neural network. Our results extend recent advances in the approximation theory of ReLU networks to the regime that is most relevant for applications in the numerical analysis of partial differential equations.
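To make the stated trade-off concrete, the LaTeX sketch below shows the schematic form that such upper bounds typically take for ReLU networks; the exponent, the parameter ranges, and the logarithmic factors are illustrative assumptions and do not reproduce the paper's theorems verbatim.

% Schematic trade-off (illustrative assumption, not a verbatim theorem):
% approximate f from the unit ball of the Sobolev space W^{n,p}((0,1)^d)
% to accuracy \varepsilon, but measure the error in the weaker norm
% W^{s,p} for some lower smoothness level s < n.  The number of nonzero
% weights M of the approximating ReLU network \Phi_\varepsilon then
% scales roughly like
\[
  \| f - \Phi_\varepsilon \|_{W^{s,p}((0,1)^d)} \le \varepsilon ,
  \qquad
  M(\Phi_\varepsilon) \lesssim \varepsilon^{-d/(n-s)}
  \quad \text{(up to logarithmic factors)} ,
\]
% so measuring the error in a stronger norm (larger s) or assuming less
% source regularity (smaller n) forces a larger network, which is the
% trade-off described in the abstract.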
Pages: 803-859
Number of pages: 57
Related Papers
50 records in total
  • [31] Optimal approximation of piecewise smooth functions using deep ReLU neural networks
    Petersen, Philipp
    Voigtlaender, Felix
    NEURAL NETWORKS, 2018, 108 : 296 - 330
  • [32] Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev and Besov Spaces
    Siegel, Jonathan W.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [33] DISCUSSION OF: "NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION"
    Shamir, Ohad
    ANNALS OF STATISTICS, 2020, 48 (04) : 1911 - 1915
  • [34] Generalization Bounds of Deep Neural Networks With τ-Mixing Samples
    Liu, Liyuan
    Chen, Yaohui
    Li, Weifu
    Wang, Yingjie
    Gu, Bin
    Zheng, Feng
    Chen, Hong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025,
  • [35] Factor Augmented Sparse Throughput Deep ReLU Neural Networks for High Dimensional Regression
    Fan, Jianqing
    Gu, Yihong
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (548) : 2680 - 2694
  • [36] Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
    Dinh Dung
    SBORNIK MATHEMATICS, 2023, 214 (04) : 479 - 515
  • [37] Reachability Analysis of Deep ReLU Neural Networks using Facet-Vertex Incidence
    Yang, Xiaodong
    Johnson, Taylor T.
    Tran, Hoang-Dung
    Yamaguchi, Tomoya
    Hoxha, Bardh
    Prokhorov, Danil
    HSCC2021: PROCEEDINGS OF THE 24TH INTERNATIONAL CONFERENCE ON HYBRID SYSTEMS: COMPUTATION AND CONTROL (PART OF CPS-IOT WEEK), 2021,
  • [38] ANALYTIC ERROR-BOUNDS FOR APPROXIMATIONS OF QUEUING-NETWORKS WITH AN APPLICATION TO ALTERNATE ROUTING
    Van Dijk, N. M.
    JOURNAL OF THE AUSTRALIAN MATHEMATICAL SOCIETY SERIES B-APPLIED MATHEMATICS, 1990, 31 : 241 - 258
  • [39] Approximations with deep neural networks in Sobolev time-space
    Abdeljawad, Ahmed
    Grohs, Philipp
    ANALYSIS AND APPLICATIONS, 2022, 20 (03) : 499 - 541
  • [40] Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations
    Immer, Alexander
    van der Ouderaa, Tycho F. A.
    Ratsch, Gunnar
    Fortuin, Vincent
    van der Wilk, Mark
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,