Approximating smooth functions by deep neural networks with sigmoid activation function

Cited: 44
Author
Langer, Sophie [1]
Affiliation
[1] Tech Univ Darmstadt, Fachbereich Math, Schlossgartenstr 7, D-64289 Darmstadt, Germany
Keywords
Deep learning; Full connectivity; Neural networks; Uniform approximation
DOI
10.1016/j.jmva.2020.104696
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any $d$-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where $W$ is the number of nonzero weights in the network and $p$ is the smoothness of the function. Unfortunately, these rates hold only for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be achieved for a simpler and more general class, i.e., DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a conclusion, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy. (C) 2020 Elsevier Inc. All rights reserved.
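The step from the width-based rate to the weight-based rate is simple bookkeeping, made explicit in the sketch below (a hedged reconstruction, not taken verbatim from the paper: it assumes fully connected layers of width $M^d$ at fixed depth, so that $W_0 \asymp M^{2d}$, and $C$ denotes a generic constant).

% Rate bookkeeping sketch: width M^d at fixed depth gives W_0 on the order of
% (M^d)^2 = M^{2d} (assumption: fully connected layers), so the two error
% rates agree up to constants.
\[
  \|f - f_{\mathrm{net}}\|_{\infty} \le C \, M^{-2p},
  \qquad
  W_0 \asymp M^{2d}
  \quad\Longrightarrow\quad
  \|f - f_{\mathrm{net}}\|_{\infty}
  \le C \, \bigl(M^{2d}\bigr)^{-p/d}
  \asymp C \, W_0^{-p/d}.
\]

In particular, to guarantee a target accuracy $\varepsilon$, this scaling suggests a network with on the order of $\varepsilon^{-d/p}$ weights.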
Pages: 21
Related Papers
50 items in total
  • [1] Learning and Approximating Piecewise Smooth Functions by Deep Sigmoid Neural Networks
    Liu, Xia
    MATHEMATICAL FOUNDATIONS OF COMPUTING, 2025, 8 (01): 74-88
  • [2] Smooth Function Approximation by Deep Neural Networks with General Activation Functions
    Ohn, Ilsang
    Kim, Yongdai
    ENTROPY, 2019, 21 (07)
  • [3] Approximating smooth and sparse functions by deep neural networks: Optimal approximation rates and saturation
    Liu, Xia
    JOURNAL OF COMPLEXITY, 2023, 79
  • [4] Activation Functions and Their Characteristics in Deep Neural Networks
    Ding, Bin
    Qian, Huimin
    Zhou, Jun
    PROCEEDINGS OF THE 30TH CHINESE CONTROL AND DECISION CONFERENCE (2018 CCDC), 2018: 1836-1841
  • [5] Deep Neural Networks with Multistate Activation Functions
    Cai, Chenghao
    Xu, Yanyan
    Ke, Dengfeng
    Su, Kaile
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2015, 2015
  • [6] Approximating functions with multi-features by deep convolutional neural networks
    Mao, Tong
    Shi, Zhongjie
    Zhou, Ding-Xuan
    ANALYSIS AND APPLICATIONS, 2023, 21 (01): 93-125
  • [7] Theory of deep convolutional neural networks III: Approximating radial functions
    Mao, Tong
    Shi, Zhongjie
    Zhou, Ding-Xuan
    NEURAL NETWORKS, 2021, 144: 778-790
  • [8] Universal Approximation Using Probabilistic Neural Networks with Sigmoid Activation Functions
    Murugadoss, R.
    Ramakrishnan, M.
    2014 INTERNATIONAL CONFERENCE ON ADVANCES IN ENGINEERING AND TECHNOLOGY RESEARCH (ICAETR), 2014
  • [9] Weighted sigmoid gate unit for an activation function of deep neural network
    Tanaka, Masayuki
    PATTERN RECOGNITION LETTERS, 2020, 135 : 354 - 359
  • [10] A Formal Characterization of Activation Functions in Deep Neural Networks
    Amrouche, Massi
    Stipanovic, Dusan M.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02): 2153-2166