Neural network with unbounded activation functions is universal approximator

Cited by: 171
Authors:
Sonoda, Sho [1]
Murata, Noboru [1]
Affiliations:
[1] Waseda Univ, Fac Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords:
Neural network; Integral representation; Rectified linear unit (ReLU); Universal approximation; Ridgelet transform; Admissibility condition; Lizorkin distribution; Radon transform; Backprojection filter; Bounded extension to L-2; TRANSFORM; REPRESENTATION; SUPERPOSITIONS; RATES
DOI: 10.1016/j.acha.2015.12.005
Chinese Library Classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract:
This paper investigates the approximation properties of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), which has become the de facto standard in deep learning. The ReLU network can be analyzed by the ridgelet transform with respect to Lizorkin distributions. By deriving three reconstruction formulas, via the Fourier slice theorem, the Radon transform, and Parseval's relation, it is shown that a neural network with unbounded activation functions still satisfies the universal approximation property. As an additional consequence, the ridgelet transform, or the backprojection filter in the Radon domain, is what the network learns after backpropagation. Subject to a constructive admissibility condition, the trained network can be obtained by simply discretizing the ridgelet transform, without backpropagation. Numerical examples not only support the consistency of the admissibility condition but also imply that some non-admissible cases result in low-pass filtering. (C) 2015 Elsevier Inc. All rights reserved.
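As a minimal numerical sketch of the abstract's last claim, obtaining a working ReLU network by discretizing an integral representation instead of running backpropagation, the snippet below does not use the paper's ridgelet transform; as a one-dimensional stand-in it uses the Taylor-remainder identity f(x) = f(0) + f'(0)*x + integral over b >= 0 of relu(x - b)*f''(b) db (valid for x >= 0 and smooth f). The target function, grids, and names (f, d2f) are illustrative choices, not the paper's construction.

```python
import numpy as np

# Discretize an integral representation over ReLU atoms to get a one-hidden-
# layer network with no gradient descent. Stand-in identity (not the ridgelet
# transform):  f(x) = f(0) + f'(0)*x + int_0^inf relu(x - b) f''(b) db, x >= 0.

relu = lambda z: np.maximum(z, 0.0)

f   = np.tanh                                        # target function
d2f = lambda b: -2.0 * np.tanh(b) / np.cosh(b) ** 2  # f'' for tanh

b  = np.linspace(0.0, 5.0, 200)   # hidden-unit biases = quadrature nodes
db = b[1] - b[0]
w  = d2f(b) * db                  # outer weights = discretized transform of f

x     = np.linspace(0.0, 4.0, 100)
f_hat = f(0.0) + 1.0 * x + relu(x[:, None] - b[None, :]) @ w  # f'(0) = 1

print("max |f_hat - f| =", np.abs(f_hat - f(x)).max())  # small quadrature error
```

The outer weights come directly from a transform of the target (here its second derivative), mirroring the paper's point that, under the admissibility condition, the ridgelet transform of f supplies the network coefficients directly.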
Pages: 233-268 (36 pages)
Related papers (items 41-50 of 50):
  • [41] Neuroevolutionary based convolutional neural network with adaptive activation functions
    ZahediNasab, Roxana
    Mohseni, Hadis
    NEUROCOMPUTING, 2020, 381: 306-313
  • [42] The impact of activation functions on training and performance of a deep neural network
    Marcu, David C.
    Grava, Cristian
    2021 16TH INTERNATIONAL CONFERENCE ON ENGINEERING OF MODERN ELECTRIC SYSTEMS (EMES), 2021: 126-129
  • [43] All-optical neural network with nonlinear activation functions
    Zuo, Ying
    Li, Bohan
    Zhao, Yujun
    Jiang, Yue
    Chen, You-Chiuan
    Chen, Peng
    Jo, Gyu-Boong
    Liu, Junwei
    Du, Shengwang
    OPTICA, 2019, 6 (09): 1132-1137
  • [44] On Neural Network Activation Functions and Optimizers in Relation to Polynomial Regression
    Pomerat, John
    Segev, Aviv
    Datta, Rituparna
    2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019: 6183-6185
  • [45] New activation functions for single layer feedforward neural network
    Kocak, Yilmaz
    Siray, Gulesen Ustundag
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 164
  • [46] UNIVERSAL FORMULA FOR NETWORK FUNCTIONS
    Skelboe, S.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS, 1975, CAS-22 (01): 58-60
  • [47] Generalized fuzzy hyperbolic model: A universal approximator
    Sch. of Info. Sci. and Eng., Northeastern Univ., Shenyang 110004, China
    Dongbei Daxue Xuebao/Journal of Northeastern University, 2003, 24 (01): 1-3
  • [48] A Novel Fuzzy Neural Network Approximator with Exponential Fast Terminal Sliding Mode
    He, Ming
    Liu, Yunfeng
    Liu, GuangBin
    Liu, Huafeng
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008: 4736-4740
  • [49] Multistability of recurrent neural networks with general periodic activation functions and unbounded time-varying delays
    Wang, Jiarui
    Zhu, Song
    Ma, Qingyang
    Mu, Chaoxu
    Liu, Xiaoyang
    Wen, Shiping
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2024, 361 (18)
  • [50] Unbounded Fuzzy Hypersphere Neural Network Classifier
    Mahindrakar, M. S.
    Kulkarni, U. V.
    JOURNAL OF THE INSTITUTION OF ENGINEERS (INDIA): SERIES B, 2022, 103 (04): 1335-1343