Spurious Local Minima are Common in Two-Layer ReLU Neural Networks

Cited by: 0
Authors
Safran, Itay [1]
Shamir, Ohad [1]
Affiliations
[1] Weizmann Inst Sci, Rehovot, Israel
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the optimization problem associated with training simple ReLU neural networks of the form $x \mapsto \sum_{i=1}^{k} \max\{0, w_i^\top x\}$ with respect to the squared loss. We provide a computer-assisted proof that even if the input distribution is standard Gaussian, even if the dimension is arbitrarily large, and even if the target values are generated by such a network, with orthonormal parameter vectors, the problem can still have spurious local minima once $6 \le k \le 20$. By a concentration-of-measure argument, this implies that in high input dimensions, nearly all target networks of the relevant sizes lead to spurious local minima. Moreover, we conduct experiments which show that the probability of hitting such local minima is quite high, and increases with the network size. On the positive side, mild over-parameterization appears to drastically reduce such local minima, indicating that an over-parameterization assumption is necessary to get a positive result in this setting.
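The training setup described in the abstract can be sketched in a few lines of NumPy: a teacher network with orthonormal parameter vectors generates targets from standard Gaussian inputs, and a student of the same form is fit by plain gradient descent on the squared loss. This is an illustrative toy, not the paper's computer-assisted proof; the dimension, width, sample size, learning rate, and initialization scale are all assumed values chosen for the example.

```python
import numpy as np

# Assumed parameters for illustration: d (input dim), k (width), n (samples).
rng = np.random.default_rng(0)
d, k, n = 20, 6, 5000

# Teacher network with orthonormal parameter vectors (rows of the identity).
V = np.eye(d)[:k]
X = rng.standard_normal((n, d))            # standard Gaussian inputs
y = np.maximum(X @ V.T, 0.0).sum(axis=1)   # targets generated by the teacher

W = 0.5 * rng.standard_normal((k, d))      # student initialization

def sq_loss(W):
    """Mean squared loss of the student x -> sum_i max(0, w_i^T x)."""
    out = np.maximum(X @ W.T, 0.0).sum(axis=1)
    return np.mean((out - y) ** 2)

init_loss = sq_loss(W)
lr = 1e-2
for _ in range(500):
    pre = X @ W.T                          # (n, k) pre-activations
    resid = np.maximum(pre, 0.0).sum(axis=1) - y
    # Gradient of the mean squared loss w.r.t. W, using the ReLU subgradient.
    grad = 2.0 * ((resid[:, None] * (pre > 0)).T @ X) / n
    W -= lr * grad

final_loss = sq_loss(W)
```

Depending on the random seed and width, such a run may converge to zero loss or stall at a strictly positive plateau; the paper's experiments probe exactly how often the latter happens as `k` grows.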
Pages: 9