Training Fully Connected Neural Networks is ∃R-Complete

Cited by: 0
Authors
Bertschinger, Daniel [1 ]
Hertrich, Christoph [2 ,5 ,6 ]
Jungeblut, Paul [3 ]
Miltzow, Tillmann [4 ]
Weber, Simon [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
[2] London Sch Econ & Polit Sci, Dept Math, London, England
[3] Karlsruhe Inst Technol, Inst Theoret Informat, Karlsruhe, Germany
[4] Univ Utrecht, Dept Informat & Comp Sci, Utrecht, Netherlands
[5] Univ Libre, Brussels, Belgium
[6] Goethe Univ, Frankfurt, Germany
Funding
Swiss National Science Foundation; European Research Council
Keywords
BOUNDS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully connected neural network to fit a given set of data points, also known as empirical risk minimization. We show that the problem is ∃R-complete. This complexity class can be defined as the set of algorithmic problems that are polynomial-time equivalent to deciding whether a multivariate polynomial with integer coefficients has a real root. Furthermore, we show that arbitrary algebraic numbers are required as weights to train some instances to optimality, even if all data points are rational. Our result already applies to fully connected instances with two inputs, two outputs, and one hidden layer of ReLU neurons. Thereby, we strengthen a result by Abrahamsen, Kleist and Miltzow [NeurIPS 2021]. As a consequence, a combinatorial search algorithm like the one by Arora, Basu, Mianjy and Mukherjee [ICLR 2018] is impossible for networks with more than one output dimension, unless NP = ∃R.
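The empirical risk minimization problem the abstract refers to can be sketched concretely. The snippet below is an illustrative toy instance, not code from the paper: network sizes, data, and function names are assumptions. It evaluates the squared-error empirical risk of a one-hidden-layer ReLU network with two inputs and two outputs, the architecture the paper proves hard to train optimally.

```python
import numpy as np

def relu(z):
    # Rectified linear unit, applied elementwise.
    return np.maximum(z, 0.0)

def empirical_risk(W1, b1, W2, b2, X, Y):
    """Squared-error empirical risk of a one-hidden-layer ReLU network.

    X: (n, 2) rational input points, Y: (n, 2) rational targets,
    matching the two-input / two-output setting of the paper.
    """
    H = relu(X @ W1 + b1)      # hidden ReLU layer
    preds = H @ W2 + b2        # linear output layer
    return np.mean(np.sum((preds - Y) ** 2, axis=1))

# A toy data set with rational points. The paper shows that for some such
# instances the *exact* minimizer needs irrational (algebraic) weights, so
# evaluating candidates in rational arithmetic cannot enumerate all optima.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]])

# Arbitrary weights for three hidden neurons (sizes are illustrative).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((2, 3)), np.zeros(3)
W2, b2 = rng.standard_normal((3, 2)), np.zeros(2)

risk = empirical_risk(W1, b1, W2, b2, X, Y)
print(f"empirical risk at random weights: {risk:.4f}")
```

Gradient-based training only searches for such weights heuristically; the ∃R-completeness result concerns deciding whether weights achieving a given risk exist at all.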
Pages: 16
Related Papers
50 records in total
  • [11] On the BP Training Algorithm of Fuzzy Neural Networks (FNNs) via Its Equivalent Fully Connected Neural Networks (FFNNs)
    Wang, Jing
    Wang, Chi-Hsu
    Chen, C. L. Philip
    2011 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2011, : 1376 - 1381
  • [12] On the Learnability of Fully-connected Neural Networks
    Zhang, Yuchen
    Lee, Jason D.
    Wainwright, Martin J.
    Jordan, Michael I.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 83 - 91
  • [13] Further ∃R-Complete Problems with PSD Matrix Factorizations
    Shitov, Yaroslav
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2024, 24 (04) : 1225 - 1248
  • [14] Automatic model selection for fully connected neural networks
    Laredo, D.
    Ma, S. F.
    Leylaz, G.
    Schütze, O.
    Sun, J.-Q.
    International Journal of Dynamics and Control, 2020, 8 (04) : 1063 - 1079
  • [15] Data Symmetries and Learning in Fully Connected Neural Networks
    Anselmi, Fabio
    Manzoni, Luca
    D'onofrio, Alberto
    Rodriguez, Alex
    Caravagna, Giulio
    Bortolussi, Luca
    Cairoli, Francesca
    IEEE ACCESS, 2023, 11 : 47282 - 47290
  • [16] RAC-Drawability is ∃R-complete and Related Results
Schaefer, M.
    Journal of Graph Algorithms and Applications, 2023, 27 (09) : 803 - 841
  • [17] Q-PSEUDOCONVEX DOMAIN IN R-COMPLETE MANIFOLD
    CHEN, ZH
    LU, ZQ
CHINESE SCIENCE BULLETIN, 1990, 35 (05) : 366 - 370
  • [18] On the Conjugate Gradients (CG) Training Algorithm of Fuzzy Neural Networks (FNNs) via Its Equivalent Fully Connected Neural Networks (FFNNs)
    Wang, Jing
    Chen, C. L. Philip
    Wang, Chi-Hsu
    PROCEEDINGS 2012 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2012, : 2446 - 2451
  • [19] EQUIVALENCE OF APPROXIMATION BY CONVOLUTIONAL NEURAL NETWORKS AND FULLY-CONNECTED NETWORKS
    Petersen, Philipp
    Voigtlaender, Felix
    PROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY, 2020, 148 (04) : 1567 - 1581
  • [20] Cohomology groups of locally q-complete morphisms with r-complete base
    Vajaitu, V
    MATHEMATICA SCANDINAVICA, 1996, 79 (02) : 161 - 175