Training neural networks with structured noise improves classification and generalization

Cited: 0
Authors
Benedetti, Marco [1 ]
Ventura, Enrico [1 ,2 ]
Affiliations
[1] Sapienza Univ Roma, Dipartimento Fis, Ple A Moro 2, I-00185 Rome, Italy
[2] Univ PSL, Ecole Normale Super, Lab Phys, ENS, F-75005 Paris, France
Keywords
recurrent neural networks; perceptron learning; unlearning; associative memory; PATTERNS; STORAGE; SPACE; MODEL;
DOI
10.1088/1751-8121/ad7b8f
Chinese Library Classification
O4 [Physics];
Discipline Code
0702;
Abstract
The beneficial role of noise-injection in learning is a consolidated concept in the field of artificial neural networks, suggesting that even biological systems might take advantage of similar mechanisms to optimize their performance. The training-with-noise (TWN) algorithm proposed by Gardner and collaborators is an emblematic example of a noise-injection procedure in recurrent networks, which can be used to model biological neural systems. We show how adding structure to noisy training data can substantially improve the algorithm performance, allowing the network to approach perfect retrieval of the memories and wide basins of attraction, even in the scenario of maximal injected noise. We also prove that the so-called Hebbian Unlearning rule coincides with the TWN algorithm when noise is maximal and data are stable fixed points of the network dynamics.
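The abstract describes recurrent (Hopfield-type) networks whose couplings are first built by the Hebbian rule and then refined, e.g. by Hebbian Unlearning, which subtracts small Hebbian contributions from spurious attractors reached by the dynamics. The sketch below is a minimal illustration of that generic setup, not the paper's actual procedure: the network size, learning rate, and number of unlearning steps are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10  # neurons, stored patterns (illustrative sizes)

# Binary +/-1 patterns and Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)  # no self-couplings

def relax(J, s, max_sweeps=100):
    """Asynchronous zero-temperature dynamics until a fixed point."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Hebbian Unlearning: relax from random states and subtract a small
# Hebbian term built from the spurious fixed point that was reached.
eps = 0.01
for _ in range(200):
    sf = relax(J, rng.choice([-1, 1], size=N))
    J -= (eps / N) * np.outer(sf, sf)
    np.fill_diagonal(J, 0)
```

Whether such a procedure widens the basins of attraction of the stored patterns depends on the unlearning strength and duration; the paper's claim is that this rule coincides with training-with-noise in the limit of maximal noise when the data are stable fixed points.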
Pages: 26