Spurious Local Minima are Common in Two-Layer ReLU Neural Networks

Cited by: 0
Authors: Safran, Itay [1]; Shamir, Ohad [1]
Affiliations: [1] Weizmann Inst Sci, Rehovot, Israel
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We consider the optimization problem associated with training simple ReLU neural networks of the form $x \mapsto \sum_{i=1}^{k} \max\{0, w_i^\top x\}$ with respect to the squared loss. We provide a computer-assisted proof that even if the input distribution is standard Gaussian, even if the dimension is arbitrarily large, and even if the target values are generated by such a network, with orthonormal parameter vectors, the problem can still have spurious local minima once $6 \le k \le 20$. By a concentration of measure argument, this implies that in high input dimensions, nearly all target networks of the relevant sizes lead to spurious local minima. Moreover, we conduct experiments which show that the probability of hitting such local minima is quite high, and increasing with the network size. On the positive side, mild over-parameterization appears to drastically reduce such local minima, indicating that an over-parameterization assumption is necessary to obtain a positive result in this setting.
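To make the abstract's setup concrete, the following is a minimal sketch (not the authors' code) of the training problem it describes: a student network $x \mapsto \sum_{i=1}^{k} \max\{0, w_i^\top x\}$ is fit by stochastic gradient descent on the squared loss against a target network of the same architecture with orthonormal weight vectors, under standard Gaussian inputs. The values of k, d, the step size, batch size, and iteration count below are illustrative assumptions; a final loss bounded away from zero would suggest convergence to a spurious local minimum.

```python
import numpy as np

# Illustrative sketch, not the paper's experiment code: train
# f(x) = sum_i max(0, w_i . x) on the squared loss against a target
# network of the same form whose weight vectors are orthonormal,
# with standard Gaussian inputs. k, d, step size, batch size, and
# iteration count are assumptions chosen for illustration.

rng = np.random.default_rng(0)
k, d = 6, 20                                        # network size and input dimension (k <= d)
V = np.linalg.qr(rng.standard_normal((d, k)))[0].T  # orthonormal target rows, shape (k, d)
W = rng.standard_normal((k, d)) / np.sqrt(d)        # random student initialization

def forward(W, X):
    """f(x) = sum_i max(0, w_i . x), evaluated on the rows of X."""
    return np.maximum(0.0, X @ W.T).sum(axis=1)

for step in range(20000):
    X = rng.standard_normal((256, d))       # fresh Gaussian minibatch (SGD on the population loss)
    r = forward(W, X) - forward(V, X)       # residuals against the target network
    A = (X @ W.T > 0).astype(float)         # ReLU activation pattern, shape (256, k)
    grad = (A * r[:, None]).T @ X / len(X)  # gradient of 0.5 * mean(r**2) w.r.t. W
    W -= 0.1 * grad

# Estimate the final population loss on a large held-out Gaussian sample.
# Depending on the random initialization, this may be near zero (global
# minimum) or bounded away from zero (a spurious local minimum).
X = rng.standard_normal((200000, d))
print(0.5 * np.mean((forward(W, X) - forward(V, X)) ** 2))
```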
Pages: 9
Related papers (10 of 50 shown)
  • [1] Annihilation of Spurious Minima in Two-Layer ReLU Networks
    Arjevani, Yossi; Field, Michael
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [2] Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks: A Tale of Symmetry II
    Arjevani, Yossi; Field, Michael
    Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34
  • [3] Hidden Minima in Two-Layer ReLU Networks
    Arjevani, Yossi
    arXiv, 2023
  • [4] Convergence Analysis of Two-layer Neural Networks with ReLU Activation
    Li, Yuanzhi; Yuan, Yang
    Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017, 30
  • [5] A Global Universality of Two-Layer Neural Networks with ReLU Activations
    Hatano, Naoya; Ikeda, Masahiro; Ishikawa, Isao; Sawano, Yoshihiro
    Journal of Function Spaces, 2021, 2021
  • [6] Learning behavior and temporary minima of two-layer neural networks
    Annema, Anne-Johan; Hoen, Klaas; Wallinga, Hans
    Neural Networks, 1994, 7(9): 1387-1404
  • [7] Spurious Local Minima Are Common for Deep Neural Networks With Piecewise Linear Activations
    Liu, Bo
    IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(4): 5382-5394
  • [8] Provable Identifiability of Two-Layer ReLU Neural Networks via LASSO Regularization
    Li, G.; Wang, G.; Ding, J.
    IEEE Transactions on Information Theory, 2023, 69(9): 5921-5935
  • [9] Phase Diagram for Two-Layer ReLU Neural Networks at Infinite-Width Limit
    Luo, Tao; Xu, Zhi-Qin John; Ma, Zheng; Zhang, Yaoyu
    Journal of Machine Learning Research, 2021, 22: 1-47