Spurious Local Minima are Common in Two-Layer ReLU Neural Networks

Cited by: 0
Authors
Safran, Itay [1 ]
Shamir, Ohad [1 ]
Affiliations
[1] Weizmann Inst Sci, Rehovot, Israel
Keywords: none listed
DOI: not available
Chinese Library Classification: TP18 (Theory of artificial intelligence)
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We consider the optimization problem associated with training simple ReLU neural networks of the form $x \mapsto \sum_{i=1}^{k} \max\{0, w_i^\top x\}$ with respect to the squared loss. We provide a computer-assisted proof that even if the input distribution is standard Gaussian, even if the dimension is arbitrarily large, and even if the target values are generated by such a network with orthonormal parameter vectors, the problem can still have spurious local minima once $6 \le k \le 20$. By a concentration of measure argument, this implies that in high input dimensions, nearly all target networks of the relevant sizes lead to spurious local minima. Moreover, we conduct experiments showing that the probability of hitting such local minima is quite high, and increases with the network size. On the positive side, mild over-parameterization appears to drastically reduce such local minima, indicating that an over-parameterization assumption is necessary to obtain a positive result in this setting.
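As a concrete illustration of the experimental setup the abstract describes, the following minimal Python sketch (our illustration, not the authors' code; the choices of k, d, sample size, learning rate, and step count are assumptions) trains a student network of the stated form on standard Gaussian inputs against a teacher with orthonormal parameter vectors, using gradient descent on the squared loss. Over repeated random initializations, a final loss bounded away from zero signals convergence to a spurious local minimum.

import numpy as np

# Minimal sketch of the setup in the abstract (illustrative, not the
# authors' code): student and teacher both compute
# x -> sum_i max(0, w_i^T x); the teacher's weight vectors are
# orthonormal; inputs are standard Gaussian; training uses plain
# gradient descent on the empirical squared loss.

rng = np.random.default_rng(0)

k, d, n = 10, 10, 10_000          # 6 <= k <= 20 is the regime in the paper
W_teacher = np.eye(k, d)          # orthonormal target parameter vectors

def net(W, X):
    # Two-layer ReLU network: sum of max(0, w_i . x) over the k units.
    return np.maximum(X @ W.T, 0.0).sum(axis=1)

def loss_and_grad(W, X, y):
    pre = X @ W.T                                  # (n, k) pre-activations
    resid = np.maximum(pre, 0.0).sum(axis=1) - y   # (n,) residuals
    loss = 0.5 * np.mean(resid ** 2)
    # dL/dw_i = mean_n resid_n * 1[w_i . x_n > 0] * x_n
    grad = ((pre > 0) * resid[:, None]).T @ X / len(y)
    return loss, grad

X = rng.standard_normal((n, d))
y = net(W_teacher, X)

W = rng.standard_normal((k, d)) / np.sqrt(d)   # random student initialization
for _ in range(5_000):
    loss, grad = loss_and_grad(W, X, y)
    W -= 0.1 * grad

# A final loss bounded away from zero suggests the run ended in a
# spurious (non-global) local minimum of the squared loss.
print(f"final squared loss: {loss:.6f}")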
Pages: 9
Related papers (50 in total)
  • [41] Efficient Global Optimization of Two-Layer ReLU Networks: Quadratic-Time Algorithms and Adversarial Training
    Bai, Yatong
    Gautam, Tanmay
    Sojoudi, Somayeh
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2023, 5 (02): 446-474
  • [42] APPROXIMATION PROPERTIES OF SOME TWO-LAYER FEEDFORWARD NEURAL NETWORKS
    Nowak, Michal A.
    OPUSCULA MATHEMATICA, 2007, 27 (01): 59-72
  • [43] Capacity of two-layer feedforward neural networks with binary weights
    Ji, CY
    Psaltis, D
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1998, 44 (01): 256-268
  • [44] A PRIORI ESTIMATES OF THE POPULATION RISK FOR TWO-LAYER NEURAL NETWORKS
    E, Weinan
    Ma, Chao
    Wu, Lei
    COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2019, 17 (05): 1407-1425
  • [45] ℓ1 Regularization in Two-Layer Neural Networks
    Li, Gen
    Gu, Yuantao
    Ding, Jie
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29: 135-139
  • [46] A mean field view of the landscape of two-layer neural networks
    Mei, Song
    Montanari, Andrea
    Nguyen, Phan-Minh
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2018, 115 (33): E7665-E7671
  • [47] Local Identifiability of Deep ReLU Neural Networks: the Theory
    Bona-Pellissier, Joachim
    Malgouyres, Francois
    Bachoc, Francois
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [48] Local minima and plateaus in multilayer neural networks
    Fukumizu, K
    Amari, S
    NINTH INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN99), VOLS 1 AND 2, 1999, (470): 597-602
  • [49] ON THE PROBLEM OF LOCAL MINIMA IN RECURRENT NEURAL NETWORKS
    BIANCHINI, M
    GORI, M
    MAGGINI, M
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1994, 5 (02): 167-172
  • [50] Synchronizability of two-layer networks
    Xu, Mingming
    Zhou, Jin
    Lu, Jun-an
    Wu, Xiaoqun
    THE EUROPEAN PHYSICAL JOURNAL B, 2015, 88