Generalization and selection of examples in feedforward neural networks

Cited by: 26
Authors: Franco, L [1]; Cannas, SA [1]
Institution: [1] Univ Nacl Cordoba, Fac Matemat Astron & Fis, RA-5000 Cordoba, Argentina
DOI: 10.1162/089976600300014999
CLC classification: TP18 [Theory of artificial intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
In this work, we study how the selection of training examples affects the learning procedure in a Boolean neural network, and how this effect depends on the complexity of the function under study and on the network architecture. We analyze the generalization capacity for different target functions with particular architectures through an analytical calculation of the minimum number of examples needed to obtain full generalization (i.e., zero generalization error). The analysis of the training sets associated with this minimum leads us to propose a general, architecture-independent criterion for selecting training examples. The criterion was checked through numerical simulations for several specific target functions and architectures, as well as for random target functions in a nonoverlapping receptive-field perceptron. In all cases, the selection sampling criterion led to an improvement in generalization capacity compared with pure random sampling. We also show that for the parity problem, one of the most widely used benchmarks for testing learning algorithms, only the use of the whole set of examples ensures global learning in a depth-two architecture. We show that this difficulty can be overcome by considering a tree-structured network of depth 2 log_2(N) - 1.
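To make the tree-structured idea concrete, the following is a minimal sketch (an illustration, not the construction from the paper): assuming Heaviside threshold units with hand-set weights, a 2-input XOR is computed by a depth-2 sub-network, and composing log_2(N) levels of such XOR blocks in a balanced binary tree computes the parity of N = 2^k inputs with overall depth on the order of 2 log_2(N). The function names (xor_gate, tree_parity) and the weight values are illustrative assumptions.

```python
import numpy as np

def step(x):
    # Heaviside threshold unit.
    return (x > 0).astype(int)

def xor_gate(a, b):
    # 2-input XOR as a depth-2 threshold sub-network:
    # h_or fires if at least one input is on, h_and fires if both are on,
    # and the output unit computes OR AND NOT AND, i.e., XOR.
    h_or = step(a + b - 0.5)
    h_and = step(a + b - 1.5)
    return step(h_or - h_and - 0.5)

def tree_parity(x):
    # Parity of N = 2^k bits via a balanced binary tree of XOR blocks;
    # the tree has log2(N) levels of gates, each of depth 2.
    x = np.asarray(x)
    while len(x) > 1:
        x = xor_gate(x[0::2], x[1::2])
    return int(x[0])

if __name__ == "__main__":
    # Check the tree network against parity computed as sum mod 2 for N = 8.
    N = 8
    for _ in range(5):
        bits = np.random.randint(0, 2, size=N)
        assert tree_parity(bits) == bits.sum() % 2
    print("tree parity matches sum mod 2")
```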
Pages: 2405-2426
Number of pages: 22