Approximating smooth functions by deep neural networks with sigmoid activation function

Cited: 44
Author
Langer, Sophie [1]
Affiliation
[1] Tech Univ Darmstadt, Fachbereich Math, Schlossgartenstr 7, D-64289 Darmstadt, Germany
Keywords
Deep learning; Full connectivity; Neural networks; Uniform approximation
DOI
10.1016/j.jmva.2020.104696
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, namely DNNs defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
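The rates in the abstract translate into back-of-the-envelope network sizes for a given target accuracy. The short Python sketch below is illustrative only and not from the paper: the function name network_budget and the fully connected weight count are assumptions. It solves the stated rates for eps: from M^{-2p} = eps one gets M ~ eps^{-1/(2p)}, a width of order M^d, and an overall weight count of order eps^{-d/p}, consistent with W_0^{-p/d} = eps.

def network_budget(eps: float, p: float, d: int) -> dict:
    """Rough network sizes implied by the abstract's rates (assumptions, not the paper's construction).

    An error of order M**(-2*p) at width of order M**d (fixed depth) gives
    M ~ eps**(-1/(2*p)); a fully connected layer of that width has on the
    order of (M**d)**2 = eps**(-d/p) weights, matching W0**(-p/d) = eps.
    """
    M = eps ** (-1.0 / (2.0 * p))   # resolution parameter from M**(-2p) = eps
    width = M ** d                  # hidden-layer width of order M**d
    W0 = eps ** (-d / p)            # overall weight count from W0**(-p/d) = eps
    return {"M": M, "width": width, "W0": W0}

# Example: a p = 2 (twice-differentiable) function on a compact subset of R^3,
# approximated to accuracy eps = 1e-2, needs a width of order 30 and roughly
# 1000 weights under these assumptions.
print(network_budget(1e-2, p=2, d=3))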
Pages: 21
Related Papers
50 records in total
  • [21] Effective Activation Functions for Homomorphic Evaluation of Deep Neural Networks
    Obla, Srinath; Gong, Xinghan; Aloufi, Asma; Hu, Peizhao; Takabi, Daniel
    IEEE ACCESS, 2020, 8: 153098-153112
  • [22] Activation Functions of Deep Neural Networks for Polar Decoding Applications
    Seo, Jihoon; Lee, Juyul; Kim, Keunyoung
    2017 IEEE 28TH ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR, AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2017
  • [23] Approximating Lipschitz continuous functions with GroupSort neural networks
    Tanielian, U.; Sangnier, M.; Biau, G.
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130: 442+
  • [24] Voltage-to-Voltage Sigmoid Neuron Activation Function Design for Artificial Neural Networks
    Moposita, Tatiana; Trojman, Lionel; Crupi, Felice; Lanuzza, Marco; Vladimirescu, Andrei
    2022 IEEE 13TH LATIN AMERICAN SYMPOSIUM ON CIRCUITS AND SYSTEMS (LASCAS), 2022: 164-167
  • [25] Sigmoid Transfer Functions in Backpropagation Neural Networks
    Harrington, P. D.
    ANALYTICAL CHEMISTRY, 1993, 65 (15): 2167-2168
  • [26] An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks
    Chai, Enhui; Yu, Wei; Cui, Tianxiang; Ren, Jianfeng; Ding, Shusheng
    SYMMETRY-BASEL, 2022, 14 (5)
  • [27] Characterization of a class of sigmoid functions with applications to neural networks
    Menon, A.; Mehrotra, K.; Mohan, C. K.; Ranka, S.
    NEURAL NETWORKS, 1996, 9 (5): 819-835
  • [28] Regularized Flexible Activation Function Combination for Deep Neural Networks
    Jie, Renlong; Gao, Junbin; Vasnev, Andrey; Tran, Minh-ngoc
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021: 2001-2008
  • [29] NIPUNA: A Novel Optimizer Activation Function for Deep Neural Networks
    Madhu, Golla; Kautish, Sandeep; Alnowibet, Khalid Abdulaziz; Zawbaa, Hossam M. M.; Mohamed, Ali Wagdy
    AXIOMS, 2023, 12 (3)
  • [30] Universal Approximation of Nonlinear System Predictions in Sigmoid Activation Functions Using Artificial Neural Networks
    Murugadoss, R.; Ramakrishnan, M.
    2014 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND COMPUTING RESEARCH (IEEE ICCIC), 2014: 1062-1067