ONLINE LEARNING WITH MINIMAL DEGRADATION IN FEEDFORWARD NETWORKS

Cited by: 15
Authors
DEANGULO, VR [1]
TORRAS, C [1]
Affiliations
[1] INST CIBERNET, E-08028 BARCELONA, SPAIN
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1995 / Vol. 6 / No. 3
DOI
10.1109/72.377971
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dealing with nonstationary processes requires quick adaptation while at the same time avoiding catastrophic forgetting. A neural learning technique that satisfies these requirements, without sacrificing the benefits of distributed representations, is presented. It relies on a formalization of the problem as the minimization of the error over the previously learned input-output (i-o) patterns, subject to the constraint of perfect encoding of the new pattern. This constrained optimization problem is then transformed into an unconstrained one with hidden-unit activations as variables. The new formulation naturally leads to an algorithm for solving the problem, which we call learning with minimal degradation (LMD). Some experimental comparisons of the performance of LMD with backpropagation are provided which, besides showing the advantages of using LMD, reveal the dependence of forgetting on the learning rate in backpropagation. We also explain why overtraining affects forgetting and fault tolerance, which are seen as related problems.
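The abstract's constrained formulation can be illustrated with a minimal sketch. Assume a hypothetical single-hidden-layer network with a linear output layer W2: "perfect encoding of the new pattern" becomes the constraint W2 @ h = y, and as a simplified proxy for minimal degradation we keep the hidden activations h close to their current values h0 (the paper's actual objective measures error over the previously learned i-o patterns, not this norm).

```python
import numpy as np

# Hedged sketch: a hypothetical net with hidden weights W1 and a linear
# output layer W2. All names and sizes here are illustrative assumptions.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 6, 2
W1 = rng.normal(size=(n_hid, n_in))
W2 = rng.normal(size=(n_out, n_hid))

x = rng.normal(size=n_in)      # new input pattern
y = np.array([1.0, -1.0])      # desired output for the new pattern

h0 = np.tanh(W1 @ x)           # current hidden-unit activations

# Minimum-norm correction as a stand-in for "minimal degradation":
#   h* = argmin ||h - h0||^2   subject to   W2 @ h = y,
# which has the closed form  h* = h0 + W2^T (W2 W2^T)^{-1} (y - W2 h0).
h_star = h0 + W2.T @ np.linalg.solve(W2 @ W2.T, y - W2 @ h0)

print(np.allclose(W2 @ h_star, y))   # constraint met exactly -> True
print(np.linalg.norm(h_star - h0))   # size of the hidden-layer change
```

Having target activations h_star in hand, one would then adjust W1 so the hidden layer produces them for input x; the paper's LMD algorithm solves the full problem with previous patterns entering the objective.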
Pages: 657-668
Page count: 12
Related papers
50 records in total
  • [31] Learning algorithms for feedforward networks based on finite samples
    Rao, NSV
    Protopopescu, V
    Mann, RC
    Oblow, EM
    Iyengar, SS
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1996, 7 (04): 926 - 940
  • [32] A FAST AND ROBUST LEARNING ALGORITHM FOR FEEDFORWARD NEURAL NETWORKS
    WEYMAERE, N
    MARTENS, JP
    NEURAL NETWORKS, 1991, 4 (03) : 361 - 369
  • [33] Learning as a Multi-Objective Optimization in Feedforward Neural Networks
    Dumitras, A
    Lazarescu, V
    Negoita, M
    FIRST INTERNATIONAL CONFERENCE ON KNOWLEDGE-BASED INTELLIGENT ELECTRONIC SYSTEMS, PROCEEDINGS 1997 - KES '97, VOLS 1 AND 2, 1997, : 588 - 593
  • [34] Learning in Feedforward Neural Networks Accelerated by Transfer Entropy
    Moldovan, Adrian
    Cataron, Angel
    Andonie, Razvan
    ENTROPY, 2020, 22 (01) : 102
  • [35] A learning algorithm for fault tolerant feedforward neural networks
    Hammadi, C
    Ito, H
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 1997, E80D (01) : 21 - 27
  • [36] Functional data learning by Hilbert feedforward neural networks
    Zhao, Jianwei
    MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2012, 35 (17) : 2111 - 2121
  • [37] LEARNING ALGORITHM FOR FEEDFORWARD NEURAL NETWORKS WITH DISCRETE SYNAPSES
    VICENTE, CJP
    CARRABINA, J
    GARRIDO, F
    VALDERRAMA, E
    LECTURE NOTES IN COMPUTER SCIENCE, 1991, 540 : 144 - 152
  • [38] ASYMPTOTIC LEARNING IN FEEDFORWARD NETWORKS WITH BINARY SYMMETRIC CHANNELS
    Zhang, Zhenliang
    Chong, Edwin K. P.
    Pezeshki, Ali
    Moran, William
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6610 - 6614
  • [39] AN EFFICIENT LEARNING ALGORITHM FOR BINARY FEEDFORWARD NEURAL NETWORKS
    Zhou, Jianxin
    Zeng, Xiaoqin
    Chan, Patrick P. K.
    PROCEEDINGS OF 2015 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOL. 2, 2015, : 609 - 615
  • [40] A general backpropagation algorithm for feedforward neural networks learning
    Yu, XH
    Efe, MO
    Kaynak, O
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (01): 251 - 254