Neural network with unbounded activation functions is universal approximator

Cited by: 171
Authors
Sonoda, Sho [1 ]
Murata, Noboru [1 ]
Affiliation
[1] Waseda Univ, Fac Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords
Neural network; Integral representation; Rectified linear unit (ReLU); Universal approximation; Ridgelet transform; Admissibility condition; Lizorkin distribution; Radon transform; Backprojection filter; Bounded extension to L^2; TRANSFORM; REPRESENTATION; SUPERPOSITIONS; RATES
DOI
10.1016/j.acha.2015.12.005
CLC number (Chinese Library Classification)
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
This paper investigates the approximation property of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), which has become the de facto standard of deep learning. The ReLU network can be analyzed by the ridgelet transform with respect to Lizorkin distributions. By deriving three reconstruction formulas, using the Fourier slice theorem, the Radon transform, and Parseval's relation, it is shown that a neural network with unbounded activation functions still satisfies the universal approximation property. As an additional consequence, the ridgelet transform, or the backprojection filter in the Radon domain, is what the network learns after backpropagation. Subject to a constructive admissibility condition, the trained network can be obtained by simply discretizing the ridgelet transform, without backpropagation. Numerical examples not only support the consistency of the admissibility condition but also imply that some non-admissible cases result in low-pass filtering. (C) 2015 Elsevier Inc. All rights reserved.
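The abstract's claim that a trained ReLU network can be obtained without backpropagation can be illustrated numerically. The sketch below is not the paper's ridgelet-discretization algorithm; it is a minimal stand-in under stated assumptions: hidden-unit parameters (a, b) are sampled at random, ReLU ridge features relu(a*x - b) are formed, and only the outer weights are fit by least squares. The target function, sampling ranges, and unit count are illustrative choices, not values from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): approximate a 1-D target
# function with a one-hidden-layer ReLU network whose hidden parameters
# (a, b) are sampled at random and whose output weights are fit by
# ordinary least squares -- no backpropagation is used, loosely echoing
# the idea of fixing the hidden layer and determining only the outer
# coefficients.

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def target(x):
    # Arbitrary smooth test function (an assumption for illustration).
    return np.sin(2 * np.pi * x) + 0.5 * x**2

# Training grid.
x = np.linspace(-1.0, 1.0, 400)
y = target(x)

# Sample hidden-unit parameters (a_j, b_j), j = 1..n.
n = 200
a = rng.normal(scale=5.0, size=n)    # ridge slopes (scalars in 1-D)
b = rng.uniform(-5.0, 5.0, size=n)   # offsets

# Design matrix of ReLU ridge functions phi_j(x) = relu(a_j * x - b_j).
Phi = relu(np.outer(x, a) - b)       # shape (len(x), n)

# Fit the output weights c by least squares (no backpropagation).
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Evaluate the approximation error on a finer grid.
x_test = np.linspace(-1.0, 1.0, 1000)
y_hat = relu(np.outer(x_test, a) - b) @ c
err = np.max(np.abs(y_hat - target(x_test)))
print(f"max abs error with {n} ReLU units: {err:.3e}")
```

Increasing the number of units n, or tuning the sampling distribution of (a, b), typically drives the error down, consistent with the universal approximation property; the paper's constructive ridgelet approach would instead prescribe the coefficients analytically.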
Pages: 233-268
Page count: 36
Related papers (50 in total)
  • [21] ENN: A Neural Network With DCT Adaptive Activation Functions
    Martinez-Gost, Marc
    Perez-Neira, Ana
    Lagunas, Miguel Angel
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2024, 18 (02) : 232 - 241
  • [22] Unification of popular artificial neural network activation functions
    Mostafanejad, Mohammad
    FRACTIONAL CALCULUS AND APPLIED ANALYSIS, 2024, 27 (06) : 3504 - 3526
  • [23] IMPLEMENTING NONLINEAR ACTIVATION FUNCTIONS IN NEURAL NETWORK EMULATORS
    SAMMUT, KM
    JONES, SR
    ELECTRONICS LETTERS, 1991, 27 (12) : 1037 - 1038
  • [24] Multivariate neural network operators with sigmoidal activation functions
    Costarelli, Danilo
    Spigler, Renato
    NEURAL NETWORKS, 2013, 48 : 72 - 77
  • [25] Deep Neural Network Using Trainable Activation Functions
    Chung, Hoon
    Lee, Sung Joo
    Park, Jeon Gue
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 348 - 352
  • [26] Neural network as a function approximator and its application in solving differential equations
    Zeyu Liu
    Yantao Yang
    Qingdong Cai
    Applied Mathematics and Mechanics, 2019, 40 : 237 - 248
  • [27] Neural network as a function approximator and its application in solving differential equations
    Liu, Zeyu
    Yang, Yantao
    Cai, Qingdong
    APPLIED MATHEMATICS AND MECHANICS-ENGLISH EDITION, 2019, 40 (02) : 237 - 248
  • [28] Neural network as a function approximator and its application in solving differential equations
    Zeyu LIU
    Yantao YANG
    Qingdong CAI
    Applied Mathematics and Mechanics (English Edition), 2019, 40 (02) : 237 - 248
  • [29] Global Robust Exponential Stability for Interval Delayed Neural Networks with Possibly Unbounded Activation Functions
    Sitian Qin
    Dejun Fan
    Ming Yan
    Qinghe Liu
    Neural Processing Letters, 2014, 40 : 35 - 50
  • [30] Multistability of Recurrent Neural Networks With Nonmonotonic Activation Functions and Unbounded Time-Varying Delays
    Liu, Peng
    Zeng, Zhigang
    Wang, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (07) : 3000 - 3010