Generalization and selection of examples in feedforward neural networks

Cited by: 26
Authors
Franco, L [1 ]
Cannas, SA [1 ]
Affiliation
[1] Univ Nacl Cordoba, Fac Matemat Astron & Fis, RA-5000 Cordoba, Argentina
Keywords
D O I
10.1162/089976600300014999
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this work, we study how the selection of examples affects the learning procedure in a Boolean neural network, and how this effect relates to the complexity of the target function and the network architecture. We analyze the generalization capacity for different target functions with particular architectures through an analytical calculation of the minimum number of examples needed to obtain full generalization (i.e., zero generalization error). Analysis of the training sets associated with this parameter leads us to propose a general, architecture-independent criterion for selecting training examples. The criterion was checked through numerical simulations for various particular target functions with particular architectures, as well as for random target functions in a nonoverlapping receptive field perceptron. In all cases, the selection sampling criterion led to an improvement in generalization capacity compared with pure random sampling. We also show that for the parity problem, one of the most widely used problems for testing learning algorithms, only the use of the whole set of examples ensures global learning in a depth-two architecture. We show that this difficulty can be overcome by considering a tree-structured network of depth 2 log2(N) - 1.
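To make the tree-structured construction concrete, the sketch below computes N-bit parity with a binary tree of two-layer threshold-unit XOR gates, in the spirit of the architecture the abstract describes. The specific weights and thresholds are illustrative choices, not taken from the paper: each XOR gate uses an OR unit and a NAND unit feeding an AND output unit.

```python
def step(x):
    # Heaviside threshold activation of a perceptron unit.
    return 1 if x >= 0 else 0

def xor_gate(a, b):
    # Two-layer threshold network computing XOR of two bits.
    # Hidden layer: OR and NAND; output layer: AND of the two.
    # Weights/thresholds are hand-picked illustrative values.
    h_or = step(a + b - 1)        # fires unless both inputs are 0
    h_nand = step(1.5 - a - b)    # fires unless both inputs are 1
    return step(h_or + h_nand - 1.5)  # fires only if both hidden units fire

def tree_parity(bits):
    # Binary XOR tree over N inputs (N a power of 2): each stage
    # halves the number of values, so there are log2(N) XOR stages.
    layer = list(bits)
    while len(layer) > 1:
        layer = [xor_gate(layer[i], layer[i + 1])
                 for i in range(0, len(layer), 2)]
    return layer[0]
```

With each XOR gate contributing two layers of threshold units and log2(N) stages in the tree, the overall depth is of the order the abstract quotes (2 log2(N) - 1, after merging adjacent layers), in contrast to the shallow depth-two architecture for which only the full example set guarantees global learning.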
Pages: 2405-2426
Page count: 22