Neural network with unbounded activation functions is universal approximator

Cited by: 171
Authors
Sonoda, Sho [1 ]
Murata, Noboru [1 ]
Affiliations
[1] Waseda Univ, Fac Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords
Neural network; Integral representation; Rectified linear unit (ReLU); Universal approximation; Ridgelet transform; Admissibility condition; Lizorkin distribution; Radon transform; Backprojection filter; Bounded extension to L^2; TRANSFORM; REPRESENTATION; SUPERPOSITIONS; RATES
DOI
10.1016/j.acha.2015.12.005
Chinese Library Classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
This paper investigates the approximation capability of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), which has become the de facto standard of deep learning. The ReLU network can be analyzed via the ridgelet transform with respect to Lizorkin distributions. By establishing three reconstruction formulas, based on the Fourier slice theorem, the Radon transform, and Parseval's relation, it is shown that a neural network with unbounded activation functions still satisfies the universal approximation property. As a further consequence, the ridgelet transform, or equivalently the backprojection filter in the Radon domain, is what the network learns through backpropagation. Subject to a constructive admissibility condition, the trained network can be obtained by simply discretizing the ridgelet transform, without backpropagation. Numerical examples not only support the consistency of the admissibility condition but also suggest that some non-admissible cases result in low-pass filtering.
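As a minimal numerical illustration of the claim that such a network can be obtained without backpropagation, the sketch below (a toy under stated assumptions, not the paper's construction) fixes random ReLU ridge parameters and solves only the output weights in closed form. The least-squares solve is a crude stand-in for the discretized ridgelet transform; Python with NumPy is assumed, and every name in the snippet is illustrative.

    import numpy as np

    # Toy check of universal approximation with ReLU units:
    #   g(x) = sum_j c_j * relu(a_j * x - b_j)
    # Hidden parameters (a_j, b_j) are fixed at random; only the output
    # weights c_j are obtained in closed form, i.e. with no gradient
    # descent. The least-squares fit below is a generic stand-in for the
    # paper's discretized ridgelet transform, not the authors' method.

    def relu(z):
        return np.maximum(z, 0.0)

    def target(x):
        return np.sin(2.0 * np.pi * x)     # smooth test function on [0, 1]

    rng = np.random.default_rng(0)
    n_units = 200
    a = rng.choice([-1.0, 1.0], n_units) * rng.uniform(1.0, 10.0, n_units)
    t = rng.uniform(0.0, 1.0, n_units)     # kink locations inside [0, 1]
    b = a * t                              # relu(a*x - b) kinks at x = t

    x = np.linspace(0.0, 1.0, 500)
    Phi = relu(np.outer(x, a) - b)         # design matrix, shape (500, n_units)

    # closed-form output weights: no backpropagation involved
    c, *_ = np.linalg.lstsq(Phi, target(x), rcond=None)
    print(f"max abs error: {np.max(np.abs(Phi @ c - target(x))):.2e}")

With a few hundred units the fit is a fine piecewise-linear interpolant of the target, consistent with the universal approximation property; choosing the coefficients analytically rather than by a generic least-squares solve is exactly what the paper's admissibility condition and ridgelet discretization address.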
Pages: 233-268
Page count: 36
Related papers
50 records in total
  • [1] A competitive functional link artificial neural network as a universal approximator
    Lotfi, Ehsan
    Rezaee, Abbas Ali
    SOFT COMPUTING, 2018, 22 (14) : 4613 - 4625
  • [2] Implementation of Universal Neural Network Approximator on a ULP Microcontroller for Wavelet Synthesis in Electroencephalography
    Bogoslovskii, Ivan A.
    Ermolenko, Daniil V.
    Stepanov, Andrey B.
    Kilicheva, Klavdiya Kh.
    Pomogalova, Albina V.
    PROCEEDINGS OF THE 2019 IEEE CONFERENCE OF RUSSIAN YOUNG RESEARCHERS IN ELECTRICAL AND ELECTRONIC ENGINEERING (EICONRUS), 2019, : 1146 - 1151
  • [3] Estimating the Heat Transfer Coefficient using Universal Function Approximator Neural Network
    Szenasi, Sandor
    Felde, Imre
    Nagy, Gabor
    Deus, Augusto
    2018 IEEE 12TH INTERNATIONAL SYMPOSIUM ON APPLIED COMPUTATIONAL INTELLIGENCE AND INFORMATICS (SACI), 2018, : 401 - 404
  • [4] Comparison of ReLU and linear saturated activation functions in neural network for universal approximation
    Stursa, Dominik
    Dolezel, Petr
    PROCEEDINGS OF THE 2019 22ND INTERNATIONAL CONFERENCE ON PROCESS CONTROL (PC19), 2019, : 146 - 151
  • [5] The delayed response network: towards a single layer universal neural network approximator and delay-based learning
    Dinov, Martin
    Elias, Rut
    BMC Neuroscience, 16 (Suppl 1)
  • [6] Universal Functions and Unbounded Branching Trees
    Khisamiev, A. N.
    ALGEBRA AND LOGIC, 2018, 57 (04) : 309 - 319
  • [7] The Role of Neural Network Activation Functions
    Parhi, Rahul
    Nowak, Robert D.
    IEEE SIGNAL PROCESSING LETTERS, 2020, 27 : 1779 - 1783
  • [8] Universal matrices and strongly unbounded functions
    Koszmider, P
    MATHEMATICAL RESEARCH LETTERS, 2002, 9 (04) : 549 - 566