Using quasirandom weights in neural networks

Cited by: 0
Authors
Anderson, PG [1]
Ge, M [1]
Raghavendra, S [1]
Lung, ML [1]
Affiliation
[1] Rochester Inst Technol, Dept Comp Sci & Imaging Sci, Rochester, NY 14623 USA
Keywords
neural networks; hidden layers; quasirandom weights
DOI
None available
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline code
0812
Abstract
摘要
We present a novel training algorithm for a feedforward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm is capable of training networks for hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasirandom number generator. These weights are frozen; they are never modified during the training process. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudoinverse, with possible iterations. We also study the problem of reducing the hidden layer: pruning low-weight nodes and using a genetic algorithm to search for good subsets.
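The training scheme the abstract describes (frozen quasirandom hidden weights, output layer solved by pseudoinverse) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Halton sequence is used here as one example of a quasirandom generator (the abstract does not name a specific one), and the class name `QuasirandomNet`, the `tanh` activation, and the layer sizes are all illustrative assumptions.

```python
import numpy as np

def halton(n, d):
    """First n points of the d-dimensional Halton quasirandom sequence
    (an illustrative choice of quasirandom generator)."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    def vdc(i, base):  # van der Corput radical-inverse in the given base
        x, f = 0.0, 1.0 / base
        while i > 0:
            x += f * (i % base)
            i //= base
            f /= base
        return x
    return np.array([[vdc(i + 1, primes[j]) for j in range(d)]
                     for i in range(n)])

class QuasirandomNet:
    """Single hidden layer; hidden weights are quasirandom and frozen,
    the output layer is fit as a linear discriminator via pseudoinverse."""
    def __init__(self, n_inputs, n_hidden):
        # Map Halton points from [0,1) into [-1,1); never trained afterwards.
        self.W = 2.0 * halton(n_hidden, n_inputs + 1) - 1.0  # +1 column for bias

    def _hidden(self, X):
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
        return np.tanh(Xb @ self.W.T)

    def fit(self, X, y):
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y  # linear least-squares solve
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

Because only the output layer is trained, and by a direct linear solve, fitting requires no backpropagation through the hidden layer; for example, a 20-unit network of this form fits the XOR targets exactly.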
Pages: 61-71 (11 pages)