Training neural networks with structured noise improves classification and generalization

Cited: 0
Authors
Benedetti, Marco [1 ]
Ventura, Enrico [1 ,2 ]
Affiliations
[1] Sapienza Univ Roma, Dipartimento Fis, Ple A Moro 2, I-00185 Rome, Italy
[2] Univ PSL, Ecole Normale Super, Lab Phys, ENS, F-75005 Paris, France
Keywords
recurrent neural networks; perceptron learning; unlearning; associative memory; PATTERNS; STORAGE; SPACE; MODEL
DOI
10.1088/1751-8121/ad7b8f
CLC number
O4 [Physics]
Subject classification
0702
Abstract
The beneficial role of noise injection in learning is a well-established concept in the field of artificial neural networks, suggesting that even biological systems might exploit similar mechanisms to optimize their performance. The training-with-noise (TWN) algorithm proposed by Gardner and collaborators is an emblematic example of a noise-injection procedure for recurrent networks, which can be used to model biological neural systems. We show that adding structure to the noisy training data substantially improves the algorithm's performance, allowing the network to approach perfect retrieval of the memories and wide basins of attraction, even when the injected noise is maximal. We also prove that the so-called Hebbian unlearning rule coincides with the TWN algorithm when noise is maximal and the data are stable fixed points of the network dynamics.
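The abstract refers to the Hebbian unlearning rule for recurrent (Hopfield-type) networks. As an illustrative sketch only, not the paper's implementation, the classical unlearning procedure of Hopfield, Feinstein and Palmer can be written as follows: initialize couplings with the Hebb rule, relax random states to (typically spurious) fixed points of the zero-temperature dynamics, and subtract a small Hebbian term built from each fixed point. All function names and parameter values here are hypothetical choices for the sketch.

```python
import numpy as np

def hebbian_init(patterns):
    """Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    P, N = patterns.shape
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def run_to_fixed_point(J, s, max_sweeps=100, rng=None):
    """Zero-temperature asynchronous dynamics until no spin flips in a sweep."""
    if rng is None:
        rng = np.random.default_rng()
    N = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(N):
            h = J[i] @ s                  # local field on neuron i
            new = 1.0 if h >= 0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                   # fixed point reached
            break
    return s

def hebbian_unlearning(J, n_steps, lam=0.01, rng=None):
    """Relax random states to fixed points and apply an anti-Hebbian update."""
    if rng is None:
        rng = np.random.default_rng()
    N = J.shape[0]
    for _ in range(n_steps):
        s = rng.choice([-1.0, 1.0], size=N)
        s = run_to_fixed_point(J, s, rng=rng)
        J -= (lam / N) * np.outer(s, s)   # unlearn the sampled fixed point
        np.fill_diagonal(J, 0.0)
    return J
```

The update step subtracts the outer product of the sampled fixed point, which preferentially destabilizes spurious attractors while (for small `lam`) leaving the stored memories nearly intact; the paper's result relates this rule to training with maximally noisy data.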
Pages: 26
Related papers
50 records
  • [21] Environmental Noise Classification Using Convolution Neural Networks
    Li, Mengyuan
    Gao, Zhenbin
    Zang, Xinzhe
    Wang, Xia
    PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON ELECTRONICS AND ELECTRICAL ENGINEERING TECHNOLOGY (EEET 2018), 2018, : 182 - 185
  • [22] Convolutional Neural Networks for Noise Classification and Denoising of Images
    Sil, Dibakar
    Dutta, Arindam
    Chandra, Aniruddha
    PROCEEDINGS OF THE 2019 IEEE REGION 10 CONFERENCE (TENCON 2019): TECHNOLOGY, KNOWLEDGE, AND SOCIETY, 2019, : 447 - 451
  • [23] Towards Understanding the Importance of Noise in Training Neural Networks
    Zhou, Mo
    Liu, Tianyi
    Li, Yan
    Lin, Dachao
    Zhou, Enlu
    Zhao, Tuo
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [24] Training neural networks with additive noise in the desired signal
    Wang, CA
    Principe, JC
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1999, 10 (06): : 1511 - 1517
  • [25] Training neural networks with additive noise in the desired signal
    Wang, C
    Principe, JC
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 1084 - 1089
  • [26] Selection of minimum training data for generalization and on-line training by multilayer neural networks
    Hara, K
    Nakayama, K
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 436 - 441
  • [27] Neural network training fingerprint: visual analytics of the training process in classification neural networks
    Ferreira, Martha Dais
    Cantareira, Gabriel D.
    de Mello, Rodrigo F.
    Paulovich, Fernando V.
    JOURNAL OF VISUALIZATION, 2022, 25 (03) : 593 - 612
  • [28] Neural network training fingerprint: visual analytics of the training process in classification neural networks
    Martha Dais Ferreira
    Gabriel D. Cantareira
    Rodrigo F. de Mello
    Fernando V. Paulovich
    Journal of Visualization, 2022, 25 : 593 - 612
  • [29] INVESTIGATING GENERALIZATION IN NEURAL NETWORKS UNDER OPTIMALLY EVOLVED TRAINING PERTURBATIONS
    Chaudhury, Subhajit
    Yamasaki, Toshihiko
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3617 - 3621
  • [30] Training Algorithm Performance for Image Classification by Neural Networks
    Zhou, Libin
    Yang, Xiaojun
    PHOTOGRAMMETRIC ENGINEERING AND REMOTE SENSING, 2010, 76 (08): : 945 - 951