Approximating smooth functions by deep neural networks with sigmoid activation function

Cited: 44
Author
Langer, Sophie [1]
Affiliation
[1] Tech Univ Darmstadt, Fachbereich Math, Schlossgartenstr 7, D-64289 Darmstadt, Germany
Keywords
Deep learning; Full connectivity; Neural networks; Uniform approximation
DOI
10.1016/j.jmva.2020.104696
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^(-p/d), where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, namely DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^(-2p). As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^(-p/d). This more general result finally helps us to understand which network topology guarantees a given target accuracy.
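The rate W_0^(-p/d) implies that, to reach a target uniform accuracy ε, the number of weights must grow like ε^(-d/p). A minimal sketch of this arithmetic (the function name and parameter choices are illustrative, not from the paper):

```python
def weights_for_accuracy(eps: float, p: float, d: int) -> float:
    """Illustrative weight count W0 such that W0 ** (-p / d) <= eps,
    i.e. W0 = eps ** (-d / p), following the rate stated in the abstract."""
    return eps ** (-d / p)

# Smoothness p = 2 in dimension d = 4 gives d/p = 2, so halving the
# target accuracy requires roughly 4x as many weights.
w_coarse = weights_for_accuracy(0.10, p=2, d=4)  # 0.10 ** -2 = 100
w_fine = weights_for_accuracy(0.05, p=2, d=4)    # 0.05 ** -2 = 400
```

This also shows the curse of dimensionality in the bound: for fixed smoothness p, the exponent d/p, and hence the required network size, grows with the input dimension d.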
Pages: 21