GENERALIZATION IN A 2-LAYER NEURAL-NETWORK

Cited: 27
Authors:
KANG, KJ [1 ]
OH, JH [1 ]
KWON, C [1 ]
PARK, Y [1 ]
Institution:
[1] MYONG JI UNIV,DEPT PHYS,YONGIN,SOUTH KOREA
Source:
PHYSICAL REVIEW E | 1993 / Vol. 48 / No. 06
Keywords:
DOI:
10.1103/PhysRevE.48.4805
CLC classification:
O35 [Fluid mechanics]; O53 [Plasma physics];
Subject classification codes:
070204 ; 080103 ; 080704 ;
Abstract:
Generalization in a fully connected two-layer neural network with N input nodes, M hidden nodes, a single output node, and binary weights is studied in the annealed approximation. When the number of examples is of the order of N, the generalization error approaches a plateau and the system is in a permutation-symmetric phase. When the number of examples is of the order of MN, the system undergoes a first-order phase transition to perfect generalization and the permutation symmetry breaks. Results of computer simulations show good agreement with the analytic calculation.
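The setup described in the abstract can be sketched numerically. The toy script below (not from the paper; the network sizes, seed, and sample count are arbitrary choices for illustration) builds a teacher committee machine with binary ±1 weights and estimates a student's generalization error as the disagreement probability on random inputs. It also illustrates the permutation symmetry of the hidden units: any reordering of the teacher's hidden-unit weight vectors realizes the same input-output map and so has zero generalization error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small sizes for illustration; N and M are both odd so no ties occur.
N, M = 15, 3  # input nodes, hidden nodes

def committee_output(W, x):
    # Two-layer committee machine: sign of the sum of hidden-unit signs.
    return np.sign(np.sign(W @ x).sum())

# Teacher network with binary (+1/-1) weights.
W_teacher = rng.choice([-1, 1], size=(M, N))

def generalization_error(W_student, n_samples=2000):
    # Fraction of random binary inputs on which student and teacher disagree.
    errs = 0
    for _ in range(n_samples):
        x = rng.choice([-1, 1], size=N)
        errs += committee_output(W_student, x) != committee_output(W_teacher, x)
    return errs / n_samples

# A random, independent student performs near chance level.
W_random = rng.choice([-1, 1], size=(M, N))
print("random student:", generalization_error(W_random))

# The teacher itself, and any permutation of its hidden units,
# generalizes perfectly (permutation symmetry of the hidden layer).
W_permuted = W_teacher[[1, 2, 0]]
print("teacher:", generalization_error(W_teacher))    # 0.0
print("permuted teacher:", generalization_error(W_permuted))  # 0.0
```

In the paper's annealed analysis, the plateau at order-N examples corresponds to students that match the teacher only up to this hidden-unit permutation ambiguity, which is resolved at the first-order transition near order-MN examples.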
Pages: 4805 - 4809
Page count: 5
Related Papers (50 total)
  • [31] NEURAL-NETWORK HEURISTICS
    JOSIN, G
    BYTE, 1987, 12 (11): 183 - &
  • [33] Neural-network wavefunctions
    McCardle, Kaitlin
    NATURE COMPUTATIONAL SCIENCE, 2022, 2 (05): 284 - 284
  • [34] Role of stochastic noise and generalization error in the time propagation of neural-network quantum states
    Hofmann, Damian
    Fabiani, Giammarco
    Mentink, Johan H.
    Carleo, Giuseppe
    Sentef, Michael A.
    SCIPOST PHYSICS, 2022, 12 (05):
  • [35] Forecasting of the chaos by iterations including multi-layer neural-network
    Aoyama, T
    Zhu, HX
    Yoshihara, I
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL IV, 2000: 467 - 471
  • [36] Feature Selection Based on a Sparse Neural-Network Layer With Normalizing Constraints
    Bugata, Peter
    Drotar, Peter
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (01): 161 - 172
  • [37] Generalization in a two-layer neural network with multiple outputs
    Kang, KJ
    Oh, JH
    Kwon, C
    Park, Y
    PHYSICAL REVIEW E, 1996, 54 (02): 1811 - 1815
  • [38] On the Effect of Initialization: The Scaling Path of 2-Layer Neural Networks
    Neumayer, Sebastian
    Chizat, Lenaic
    Unser, Michael
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [39] STORAGE CAPACITY AND LEARNING ALGORITHMS FOR 2-LAYER NEURAL NETWORKS
    ENGEL, A
    KOHLER, HM
    TSCHEPKE, F
    VOLLMAYR, H
    ZIPPELIUS, A
    PHYSICAL REVIEW A, 1992, 45 (10): 7590 - 7609
  • [40] Understanding the Generalization Power of Overfitted NTK Models: 3-layer vs. 2-layer
    Ju, Peizhong
    Lin, Xiaojun
    Shroff, Ness B.
    2022 58TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2022