Back-propagation is not efficient

Cited by: 22
Author
Sima, J
Affiliations
[1] Acad. of Sci. of the Czech Republic
[2] Dept. of Theoretical Informatics, Institute of Computer Science, Acad. of Sci. of the Czech Republic, 182 07 Prague 8
Keywords
learning theory; loading problem; NP-hardness; standard sigmoid function; back-propagation; nonlinear programming;
DOI
10.1016/0893-6080(95)00135-2
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The back-propagation learning algorithm for multi-layered neural networks, which is often successfully used in practice, appears very time consuming even for small network architectures or training tasks. However, no results are yet known concerning the complexity of this algorithm. Blum and Rivest proved that training even a three-node network is NP-complete for the case when a neuron computes the discrete linear threshold function. We generalize the technique from their NP-hardness proof for a continuous sigmoidal function used in back-propagation. We show that training a three-node sigmoid network with an additional constraint on the output neuron function (e.g., zero threshold) is NP-hard. As a consequence of this, we find training sigmoid feedforward networks, with a single hidden layer and with zero threshold of output neuron, to be intractable. This implies that back-propagation is generally not an efficient algorithm, unless at least P = NP. We take advantage of these results by showing the NP-hardness of a special nonlinear programming problem. Copyright (C) 1996 Elsevier Science Ltd.
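To make the object of the hardness result concrete, here is a minimal, hedged sketch (not from the paper) of the "three-node" architecture in the Blum-Rivest style that the abstract refers to: two sigmoid hidden units feeding one output unit whose threshold (bias) is fixed to zero, as in the paper's additional constraint. The loading problem asks whether weights exist that fit a given training set; the paper shows this decision problem is NP-hard for such networks. All weight values and the toy training set below are hypothetical illustrations.

```python
import math

def sigmoid(x):
    """The standard sigmoid activation used by back-propagation."""
    return 1.0 / (1.0 + math.exp(-x))

def three_node_net(x, w1, b1, w2, b2, v):
    """Two sigmoid hidden units; the output unit has zero threshold
    (no bias term), matching the constraint in the hardness result."""
    h1 = sigmoid(sum(wi * xi for wi, xi in zip(w1, x)) + b1)
    h2 = sigmoid(sum(wi * xi for wi, xi in zip(w2, x)) + b2)
    return sigmoid(v[0] * h1 + v[1] * h2)  # no output bias

# Toy training set (hypothetical): the loading problem asks whether
# weights exist that fit pairs like these.
training_set = [((0.0, 0.0), 0.0), ((1.0, 1.0), 1.0)]

def total_error(params):
    """Sum-of-squares training error, the quantity back-propagation
    tries to minimize by gradient descent."""
    w1, b1, w2, b2, v = params
    return sum((three_node_net(x, w1, b1, w2, b2, v) - t) ** 2
               for x, t in training_set)

# Hand-picked weights that happen to fit this easy instance; deciding
# whether ANY such weights exist is NP-hard in general, which is why
# no training procedure (back-propagation included) can be efficient
# on all instances unless P = NP.
params = ((4.0, 4.0), -2.0, (-4.0, -4.0), 6.0, (5.0, -5.0))
print(f"training error: {total_error(params):.4f}")
```

The sketch only illustrates the network form and the error criterion; the NP-hardness proof itself is a reduction, not a property of any particular weight setting.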
Pages: 1017-1023
Page count: 7
Related Papers
50 records in total
  • [21] CT image reconstruction by back-propagation
    Nakao, Z
    Ali, FEF
    Chen, YW
    FIRST INTERNATIONAL CONFERENCE ON KNOWLEDGE-BASED INTELLIGENT ELECTRONIC SYSTEMS, PROCEEDINGS 1997 - KES '97, VOLS 1 AND 2, 1997, : 323 - 326
  • [22] Theories of Error Back-Propagation in the Brain
    Whittington, James C. R.
    Bogacz, Rafal
    TRENDS IN COGNITIVE SCIENCES, 2019, 23 (03) : 235 - 250
  • [23] A parallel back-propagation adder structure
    Herrfeld, A
    Hentschke, S
    INTERNATIONAL JOURNAL OF ELECTRONICS, 1998, 85 (03) : 273 - 291
  • [24] Alternating Back-Propagation for Generator Network
    Han, Tian
    Lu, Yang
    Zhu, Song-Chun
    Wu, Ying Nian
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1976 - 1984
  • [25] Truncated Back-propagation for Bilevel Optimization
    Shaban, Amirreza
    Cheng, Ching-An
    Hatch, Nathan
    Boots, Byron
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [26] Back-propagation extreme learning machine
    Weidong Zou
    Yuanqing Xia
    Weipeng Cao
    Soft Computing, 2022, 26 : 9179 - 9188
  • [27] Back-propagation as reinforcement in prediction tasks
    Grüning, A
    ARTIFICIAL NEURAL NETWORKS: FORMAL MODELS AND THEIR APPLICATIONS - ICANN 2005, PT 2, PROCEEDINGS, 2005, 3697 : 547 - 552
  • [28] WS-BP: An Efficient Wolf Search Based Back-Propagation Algorithm
    Nawi, Nazri Mohd
    Rehman, M. Z.
    Khan, Abdullah
    INTERNATIONAL CONFERENCE ON MATHEMATICS, ENGINEERING AND INDUSTRIAL APPLICATIONS 2014 (ICOMEIA 2014), 2015, 1660
  • [29] Chicken S-BP: An Efficient Chicken Swarm Based Back-Propagation Algorithm
    Khan, Abdullah
    Nawi, Nazri Mohd
    Shah, Rahmat
    Akhter, Nasreen
    Ullah, Atta
    Rehman, M. Z.
    AbdulHamid, Norhamreeza
    Chiroma, Haruna
    RECENT ADVANCES ON SOFT COMPUTING AND DATA MINING, 2017, 549 : 122 - 129
  • [30] GPU Implementation of the Multiple Back-Propagation Algorithm
    Lopes, Noel
    Ribeiro, Bernardete
    INTELLIGENT DATA ENGINEERING AND AUTOMATED LEARNING, PROCEEDINGS, 2009, 5788 : 449 - 456