Neural Networks and Rational Functions

Cited by: 0
Authors
Telgarsky, Matus [1, 2]
Affiliations
[1] Univ Illinois, Urbana, IL 61801 USA
[2] Simons Inst, Berkeley, CA USA
Keywords
CIRCUITS
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree O(polylog(1/epsilon)) which is epsilon-close, and similarly for any rational function there exists a ReLU network of size O(polylog(1/epsilon)) which is epsilon-close. By contrast, polynomials need degree Omega(poly(1/epsilon)) to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
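To make the polylog-degree claim concrete for the simplest case, below is a minimal numerical sketch (assuming NumPy) based on Newman's classical rational approximation of |x| on [-1, 1]; since ReLU(x) = (x + |x|)/2, a degree-n Newman approximant yields an epsilon-close rational function for a single ReLU with n = O(log^2(1/epsilon)). This is an illustration consistent with the abstract's statement, not necessarily the paper's exact construction.

```python
import numpy as np

def newman_abs(x, n):
    # Newman's classical degree-n rational approximation of |x| on [-1, 1]:
    # with xi = exp(-1/sqrt(n)) and p(t) = prod_{k=0}^{n-1} (t + xi^k),
    #   r(x) = x * (p(x) - p(-x)) / (p(x) + p(-x))
    # satisfies sup_{|x|<=1} |r(x) - |x|| <= 3 * exp(-sqrt(n)),
    # i.e. degree O(log^2(1/eps)) suffices for eps-accuracy.
    xi = np.exp(-1.0 / np.sqrt(n))
    powers = xi ** np.arange(n)                    # xi^0, ..., xi^(n-1)
    p_pos = np.prod(x[:, None] + powers, axis=1)   # p(x)
    p_neg = np.prod(-x[:, None] + powers, axis=1)  # p(-x)
    return x * (p_pos - p_neg) / (p_pos + p_neg)

def rational_relu(x, n):
    # ReLU(x) = (x + |x|) / 2, so a rational approximation of |x|
    # gives one of the same degree for a single ReLU unit.
    return 0.5 * (x + newman_abs(x, n))

xs = np.linspace(-1.0, 1.0, 20001)
for n in (16, 36, 64, 100):
    err = np.max(np.abs(np.maximum(xs, 0.0) - rational_relu(xs, n)))
    print(f"degree {n:3d}: max error {err:.2e} "
          f"(bound {3 * np.exp(-np.sqrt(n)):.2e})")
```

Running this, the measured error falls roughly like exp(-sqrt(n)), as the classical 3*exp(-sqrt(n)) bound guarantees; a polynomial of the same degree cannot match this, per the Omega(poly(1/epsilon)) lower bound above.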
Pages: 7
Related Papers
50 records in total
  • [41] Representation of intermolecular potential functions by neural networks
    Gassner, H
    Probst, M
    Lauenstein, A
    Hermansson, K
    JOURNAL OF PHYSICAL CHEMISTRY A, 1998, 102 (24): 4596-4605
  • [42] Neural Redshift: Random Networks are not Random Functions
    Teney, Damien
    Nicolicioiu, Armand Mihai
    Hartmann, Valentin
    Abbasnejad, Ehsan
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2024, 2024: 4786-4796
  • [43] HÖLDER CONTINUOUS ACTIVATION FUNCTIONS IN NEURAL NETWORKS
    Tatar, Nasser-Eddine
    ADVANCES IN DIFFERENTIAL EQUATIONS AND CONTROL PROCESSES, 2015, 15 (02): 93-106
  • [44] The neural networks for logical functions with optical devices
    Degeratu, V
    Degeratu, S
    Schiopu, P
    7TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL XI, PROCEEDINGS: COMMUNICATION, NETWORK AND CONTROL SYSTEMS, TECHNOLOGIES AND APPLICATIONS: II, 2003: 290-293
  • [45] Decomposing neural networks as mappings of correlation functions
    Fischer, Kirsten
    Rene, Alexandre
    Keup, Christian
    Layer, Moritz
    Dahmen, David
    Helias, Moritz
    PHYSICAL REVIEW RESEARCH, 2022, 4 (04)
  • [46] Learning Activation Functions for Sparse Neural Networks
    Loni, Mohammad
    Mohan, Aditya
    Asadi, Mehdi
    Lindauer, Marius
    INTERNATIONAL CONFERENCE ON AUTOMATED MACHINE LEARNING, VOL 224, 2023, 224
  • [47] Quantum activation functions for quantum neural networks
    Maronese, Marco
    Destri, Claudio
    Prati, Enrico
    QUANTUM INFORMATION PROCESSING, 2022, 21 (04)
  • [48] On the approximation of rough functions with deep neural networks
    De Ryck, T.
    Mishra, S.
    Ray, D.
    SeMA Journal, 2022, 79 (3): 399-440
  • [49] Stochastic Neural Networks with Monotonic Activation Functions
    Ravanbakhsh, Siamak
    Poczos, Barnabas
    Schneider, Jeff
    Schuurmans, Dale
    Greiner, Russell
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51: 809-818
  • [50] Interpretability of Neural Networks with Probability Density Functions
    Pan, Tingting
    Pedrycz, Witold
    Cui, Jiahui
    Yang, Jie
    Wu, Wei
    ADVANCED THEORY AND SIMULATIONS, 2022, 5 (03)