A more powerful Random Neural Network model in supervised learning applications

Cited: 0
Authors
Basterrech, Sebastian [1]
Rubino, Gerardo [2]
Affiliations
[1] VSB Tech Univ Ostrava, IT4Innovat, Ostrava, Czech Republic
[2] INRIA, Rennes, France
Keywords
Random Neural Network; Supervised Learning; Pattern Recognition; Numerical Optimization; Gradient Descent
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Since the early 1990s, Random Neural Networks (RNNs) have gained importance in both the Neural Networks and Queueing Networks communities. RNNs are inspired by biological neural networks and are also an extension of open Jackson networks in Queueing Theory. In 1993, a gradient-type learning algorithm was introduced to make RNNs usable in supervised learning tasks. This method treats only the weight connections among the neurons as adjustable parameters; all other parameters are held fixed during training. The RNN model has been successfully applied to several types of problems, such as supervised learning, pattern recognition, optimization, image processing, and associative memory. In this contribution we present a modification of the classic model obtained by extending the set of adjustable parameters. The modification increases the potential of the RNN model in supervised learning tasks while keeping the same network topology and the same time complexity of the algorithm. We describe the new equations implementing a gradient descent learning technique for the model.
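The abstract refers to the classic gradient-type learning algorithm without giving its equations. As a hedged sketch of the classic Gelenbe-style model the paper builds on (not the paper's extended parameter set, which the abstract does not detail), the Python fragment below computes the standard steady-state excitation probabilities by fixed-point iteration and performs one gradient-descent step on the weight matrices; forward finite differences stand in for the analytic derivatives of the 1993 algorithm, purely to keep the sketch short. All names (rnn_steady_state, train_step, w_plus, w_minus, Lam, lam) are illustrative, not taken from the paper.

```python
import numpy as np

def rnn_steady_state(w_plus, w_minus, Lam, lam, n_iter=500, tol=1e-10):
    """Fixed-point iteration for the steady-state excitation probabilities
    q_i of a classic (Gelenbe) Random Neural Network:
        q_i = lambda_plus_i / (r_i + lambda_minus_i), with
        lambda_plus_i  = Lam_i + sum_j q_j * w_plus[j, i],
        lambda_minus_i = lam_i + sum_j q_j * w_minus[j, i],
        r_i = sum_j (w_plus[i, j] + w_minus[i, j])  (firing rate of neuron i).
    """
    r = w_plus.sum(axis=1) + w_minus.sum(axis=1)
    q = np.zeros_like(Lam)
    for _ in range(n_iter):
        q_new = np.minimum((Lam + q @ w_plus) / (r + lam + q @ w_minus), 1.0)
        if np.max(np.abs(q_new - q)) < tol:
            break
        q = q_new
    return q_new

def train_step(w_plus, w_minus, Lam, lam, out_idx, target, lr=0.1, eps=1e-6):
    """One gradient-descent step on the excitatory/inhibitory weights for a
    quadratic cost on the output neurons. As in the classic 1993 method,
    only the weights are adjusted; the exogenous rates Lam and lam are fixed.
    """
    def cost():
        q = rnn_steady_state(w_plus, w_minus, Lam, lam)
        return 0.5 * np.sum((q[out_idx] - target) ** 2)

    base = cost()
    for W in (w_plus, w_minus):
        grad = np.zeros_like(W)
        for idx in np.ndindex(*W.shape):
            W[idx] += eps
            grad[idx] = (cost() - base) / eps   # finite-difference derivative
            W[idx] -= eps
        W[...] = np.maximum(W - lr * grad, 0.0)  # rates must stay non-negative
    return base
```

A minimal usage sketch with random rates and two assumed output neurons:

```python
rng = np.random.default_rng(0)
n = 4
w_plus, w_minus = rng.uniform(0.1, 0.5, (n, n)), rng.uniform(0.1, 0.5, (n, n))
Lam, lam = rng.uniform(0.5, 1.0, n), rng.uniform(0.1, 0.3, n)
for _ in range(200):
    train_step(w_plus, w_minus, Lam, lam,
               out_idx=np.array([2, 3]), target=np.array([0.7, 0.2]))
```

The paper's contribution, per the abstract, enlarges the set of adjustable parameters beyond the weights while preserving the topology and the algorithm's time complexity; which extra parameters are adapted is not specified in the abstract, so that extension is not reproduced here.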
Pages: 201-206
Page count: 6
Related Papers
50 records in total
  • [21] Forced Formation of a Geometrical Feature Space by a Neural-Network Model with Supervised Learning
    Takeda, T
    Mizoe, H
    Kishi, K
    Matsuoka, T
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 1993, E76A (07) : 1129 - 1132
  • [22] Self-Supervised Representation Learning on Neural Network Weights for Model Characteristic Prediction
    Schurholt, Konstantin
    Kostadinov, Dimche
    Borth, Damian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [23] Supervised Learning for Convolutional Neural Network with Barlow Twins
    Murugan, Ramyaa
    Mojoo, Jonathan
    Kurita, Takio
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 484 - 495
  • [24] A dynamic growing neural network for supervised or unsupervised learning
    Tian, Daxin
    Liu, Yanheng
    Wei, Da
    WCICA 2006: SIXTH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-12, CONFERENCE PROCEEDINGS, 2006, : 2886 - 2890
  • [25] Modeling of Supervised ADALINE Neural Network Learning Technique
    Pellakuri, Vidyullatha
    Rao, D. Rajeswara
    Murhty, J. V. R.
    PROCEEDINGS OF THE 2016 2ND INTERNATIONAL CONFERENCE ON CONTEMPORARY COMPUTING AND INFORMATICS (IC3I), 2016, : 17 - 22
  • [26] A Neural Network for Semi-supervised Learning on Manifolds
    Genkin, Alexander
    Sengupta, Anirvan M.
    Chklovskii, Dmitri
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: THEORETICAL NEURAL COMPUTATION, PT I, 2019, 11727 : 375 - 386
  • [27] Supervised Learning in a Multilayer, Nonlinear Chemical Neural Network
    Arredondo, David
    Lakin, Matthew R.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (10) : 7734 - 7745
  • [28] Supervised incremental learning with the fuzzy ARTMAP neural network
    Connolly, Jean-Francois
    Granger, Eric
    Sabourin, Robert
    ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION, PROCEEDINGS, 2008, 5064 : 66 - 77
  • [29] Masked convolutional neural network for supervised learning problems
    Liu, Leo Yu-Feng
    Liu, Yufeng
    Zhu, Hongtu
    STAT, 2020, 9 (01):
  • [30] Random neural network texture model
    Gelenbe, E
    Hussain, K
    Abdelbaki, H
    APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN IMAGE PROCESSING V, 2000, 3962 : 104 - 111