Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks

Cited by: 0
Authors
Safran, Itay [1 ]
Shamir, Ohad [1 ]
Affiliations
[1] Weizmann Inst Sci, Rehovot, Israel
Funding
Israel Science Foundation
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We provide several new depth-based separation results for feed-forward neural networks, proving that various types of simple and natural functions can be better approximated using deeper networks than shallower ones, even if the shallower networks are much larger. This includes indicators of balls and ellipses; non-linear functions which are radial with respect to the L-1 norm; and smooth non-linear functions. We also show that these gaps can be observed experimentally: Increasing the depth indeed allows better learning than increasing width, when training neural networks to learn an indicator of a unit ball.
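The experiment mentioned at the end of the abstract trains networks of different depths to learn the indicator of a unit ball. A minimal sketch of the data-generation step for that task is below; the dimension, sample count, and sampling range are illustrative choices, not taken from the paper:

```python
import numpy as np

def unit_ball_indicator(X):
    """Target function: 1 if the point lies inside the unit L2 ball, else 0."""
    return (np.linalg.norm(X, axis=1) <= 1.0).astype(np.float64)

def make_dataset(n_samples=1000, dim=2, radius=2.0, seed=0):
    """Sample points uniformly from [-radius, radius]^dim and label them
    with the unit-ball indicator, giving the regression/classification
    target that deeper vs. wider networks would be trained to fit."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-radius, radius, size=(n_samples, dim))
    y = unit_ball_indicator(X)
    return X, y

X, y = make_dataset()
```

One would then fit, say, a 2-layer (wide) and a 3-layer (narrower) ReLU network to `(X, y)` and compare approximation error, which is the comparison the abstract reports.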
Pages: 9