Approximating smooth functions by deep neural networks with sigmoid activation function

Cited by: 44
Authors
Langer, Sophie [1 ]
Institution
[1] Tech Univ Darmstadt, Fachbereich Math, Schlossgartenstr 7, D-64289 Darmstadt, Germany
Keywords
Deep learning; Full connectivity; Neural networks; Uniform approximation;
DOI
10.1016/j.jmva.2020.104696
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^{-p/d}, where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask ourselves if we can show the same approximation rate for a simpler and more general class, i.e., DNNs which are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^{-2p}. As a conclusion we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^{-p/d}. This more general result finally helps us to understand which network topology guarantees a given target accuracy. (C) 2020 Elsevier Inc. All rights reserved.
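As an illustration of the network class the abstract refers to (fully connected feed-forward networks with sigmoid activation, fixed depth, and a hidden width of order M^d), the following is a minimal sketch in Python/NumPy. It only sets up and evaluates such a network; it is not the paper's explicit approximating construction, and the names `width`, `depth`, `init_network`, and `sigmoid_net` are our own choices for this example.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid activation.
    return 1.0 / (1.0 + np.exp(-x))

def init_network(d, width, depth, rng):
    """Randomly initialize a fully connected network with input dimension d,
    `depth` hidden layers of constant `width`, and a scalar output."""
    sizes = [d] + [width] * depth + [1]
    return [(rng.standard_normal((m, n)), rng.standard_normal(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def sigmoid_net(params, x):
    """Evaluate the network at inputs x of shape (batch, d)."""
    for W, b in params[:-1]:
        x = sigmoid(x @ W + b)   # sigmoid activation in every hidden layer
    W, b = params[-1]
    return x @ W + b             # linear output layer

# Example: input dimension d = 2 and M = 4, so the hidden width is of
# order M**d = 16 while the depth stays fixed.
rng = np.random.default_rng(0)
d, M, depth = 2, 4, 3
params = init_network(d, width=M**d, depth=depth, rng=rng)
x = rng.uniform(-1.0, 1.0, size=(5, d))
print(sigmoid_net(params, x).shape)  # (5, 1)
```

The point of the sketch is only to make the counting concrete: the class is parameterized by width and depth alone, and the total number of weights W_0 grows with the chosen width, which is how the rate W_0^{-p/d} in the abstract is expressed.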
Pages: 21
Related Papers
50 records in total
  • [31] Sigmoid and Beyond: Algebraic Activation Functions for Artificial Neural Networks Based on Solutions of a Riccati Equation
    Protonotarios, Nicholas E.
    Fokas, Athanassios S.
    Kastis, George A.
    Dikaios, Nikolaos
    IT PROFESSIONAL, 2022, 24 (05) : 30 - 36
  • [32] Optimal approximation of piecewise smooth functions using deep ReLU neural networks
    Petersen, Philipp
    Voigtlaender, Felix
    NEURAL NETWORKS, 2018, 108 : 296 - 330
  • [33] Simple Electromagnetic Analysis Against Activation Functions of Deep Neural Networks
    Takatoi, Go
    Sugawara, Takeshi
    Sakiyama, Kazuo
    Li, Yang
    APPLIED CRYPTOGRAPHY AND NETWORK SECURITY WORKSHOPS, ACNS 2020, 2020, 12418 : 181 - 197
  • [34] Smooth Maximum Unit: Smooth Activation Function for Deep Networks using Smoothing Maximum Technique
    Biswas, Koushik
    Kumar, Sandeep
    Banerjee, Shilpak
    Pandey, Ashish Kumar
2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022: 784 - 793
  • [35] A fast learning algorithm of neural networks for approximating function
    Zhu, JB
    Ma, SL
    JOURNAL OF INFRARED AND MILLIMETER WAVES, 1998, 17 (04) : 303 - 307
  • [36] Fast learning algorithm of neural networks for approximating function
    Zhu, Jubo
    Ma, Shiling
    Hongwai Yu Haomibo Xuebao/Journal of Infrared and Millimeter Waves, 1998, 17 (04): : 303 - 307
  • [37] Complexity of Gaussian-radial-basis networks approximating smooth functions
    Kainen, Paul C.
    Kurkova, Vera
    Sanguineti, Marcello
    JOURNAL OF COMPLEXITY, 2009, 25 (01) : 63 - 74
  • [38] Realization of the sigmoid activation function for neural networks on current FPGAs by the table-driven method
Ushenina, Inna V.
    VESTNIK TOMSKOGO GOSUDARSTVENNOGO UNIVERSITETA-UPRAVLENIE VYCHISLITELNAJA TEHNIKA I INFORMATIKA-TOMSK STATE UNIVERSITY JOURNAL OF CONTROL AND COMPUTER SCIENCE, 2024, (69):
  • [39] 10 GHz Optoelectronic Circuit With Variable Sigmoid Shaped Activation Function for Photonic Neural Networks
    Chrysostomidis, Themistoklis
    Roumpos, Ioannis
    Moralis-Pegios, Miltiadis
    Lambrecht, Joris
Caillaud, Christophe
    Yin, Xin
    Pleros, Nikos
    Vyrsokinos, Konstantinos
    JOURNAL OF LIGHTWAVE TECHNOLOGY, 2024, 42 (22) : 7977 - 7988
  • [40] Size and Depth Separation in Approximating Benign Functions with Neural Networks
    Vardi, Gal
    Reichman, Daniel
    Pitassi, Toniann
    Shamir, Ohad
    CONFERENCE ON LEARNING THEORY, VOL 134, 2021, 134