Back-propagation is not efficient

Cited by: 22
Author
Sima, J
Affiliations
[1] Acad. of Sci. of the Czech Republic
[2] Dept. of Theoretical Informatics, Institute of Computer Science, Acad. of Sci. of the Czech Republic, 182 07 Prague 8
Keywords
learning theory; loading problem; NP-hardness; standard sigmoid function; back-propagation; nonlinear programming;
DOI
10.1016/0893-6080(95)00135-2
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The back-propagation learning algorithm for multi-layered neural networks, which is often used successfully in practice, appears to be very time-consuming even for small network architectures or training tasks. However, no results are yet known concerning the complexity of this algorithm. Blum and Rivest proved that training even a three-node network is NP-complete for the case when a neuron computes the discrete linear threshold function. We generalize the technique from their NP-hardness proof to the continuous sigmoidal function used in back-propagation. We show that training a three-node sigmoid network with an additional constraint on the output neuron function (e.g., zero threshold) is NP-hard. As a consequence, we find training sigmoid feedforward networks with a single hidden layer and zero threshold of the output neuron to be intractable. This implies that back-propagation is generally not an efficient algorithm, unless P = NP. We take advantage of these results by showing the NP-hardness of a special nonlinear programming problem. Copyright (C) 1996 Elsevier Science Ltd.
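For orientation, the "three-node" architecture the abstract refers to (two sigmoid hidden units feeding one sigmoid output unit whose threshold is fixed at zero) can be trained by plain gradient-descent back-propagation. The sketch below is illustrative only and is not the paper's reduction: the OR-function dataset, learning rate, and initialization are assumptions chosen so the example converges.

```python
# Minimal back-propagation sketch for the three-node sigmoid network from the
# abstract: two hidden sigmoid units (with biases) feeding one sigmoid output
# unit with ZERO threshold (no output bias). Dataset/hyperparameters are
# illustrative assumptions, not taken from the paper.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# W[j] = [w_x1, w_x2, bias] for hidden unit j; V = output weights (no bias).
W = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
V = [random.uniform(-1, 1) for _ in range(2)]

# Training task (an assumption): the Boolean OR function.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W]
    y = sigmoid(V[0] * h[0] + V[1] * h[1])  # zero output threshold
    return h, y

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial = loss()
lr = 0.5
for _ in range(2000):
    for x, t in data:
        h, y = forward(x)
        dy = 2 * (y - t) * y * (1 - y)  # error signal at the output unit
        for j in range(2):
            dh = dy * V[j] * h[j] * (1 - h[j])  # error back-propagated to hidden j
            V[j] -= lr * dy * h[j]
            W[j][0] -= lr * dh * x[0]
            W[j][1] -= lr * dh * x[1]
            W[j][2] -= lr * dh
final = loss()
```

On an easy target like OR the squared-error loss drops quickly; the paper's point is that for adversarially chosen training sets, finding weights that fit the data with this same architecture is already NP-hard, so no training algorithm (back-propagation included) can be efficient in general unless P = NP.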
Pages: 1017 - 1023
Page count: 7
Related Papers
50 items total
  • [1] BACK-PROPAGATION
    JONES, WP
    HOSKINS, J
    BYTE, 1987, 12 (11): 155 - &
  • [2] Back-propagation of accuracy
    Senashova, MY
    Gorban, AN
    Wunsch, DC
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 1998 - 2001
  • [3] Back-propagation with Chaos
    Fazayeli, Farideh
    Wang, Lipo
    Liu, Wen
    2008 INTERNATIONAL CONFERENCE ON NEURAL NETWORKS AND SIGNAL PROCESSING, VOLS 1 AND 2, 2007, : 5 - 8
  • [4] Sequential Back-Propagation
    Wang Hui
    Liu Dayou
    Wang Yafei
    Journal of Computer Science and Technology, 1994, (03) : 252 - 260
  • [5] Improving back-propagation: Epsilon-back-propagation
    Trejo, LA
    Sandoval, C
    FROM NATURAL TO ARTIFICIAL NEURAL COMPUTATION, 1995, 930 : 427 - 432
  • [6] A HIGHLY EFFICIENT IMPLEMENTATION OF BACK-PROPAGATION ALGORITHM ON SIMD COMPUTERS
    CORANA, A
    ROLANDO, C
    RIDELLA, S
    HIGH PERFORMANCE COMPUTING /, 1989, : 181 - 190
  • [7] FEATURE CONSTRUCTION FOR BACK-PROPAGATION
    PIRAMUTHU, S
    LECTURE NOTES IN COMPUTER SCIENCE, 1991, 496 : 264 - 268
  • [8] On the Local Hessian in Back-propagation
    Zhang, Huishuai
    Chen, Wei
    Liu, Tie-Yan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [9] Localized back-propagation network
    Kongzhi yu Juece / Control and Decision, 2 (152):
  • [10] Digital Back-propagation for Unrepeatered Transmission
    Lavery, Domanic
    30TH ANNUAL CONFERENCE OF THE IEEE PHOTONICS SOCIETY (IPC), 2017, : 369 - 370