A novel method to compute the weights of neural networks

Cited by: 10
Authors
Gao, Zhentao [1]
Chen, Yuanyuan [1]
Yi, Zhang [1]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Machine Intelligence Lab, Chengdu 610065, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Neural networks; Gradient-free; Closed-form solution; White-box models
DOI
10.1016/j.neucom.2020.03.114
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Neural networks are the main strength of modern artificial intelligence; they have demonstrated revolutionary performance in a wide range of applications. In practice, the weights of neural networks are generally obtained indirectly through iterative training, which is inefficient and problematic in many respects. Moreover, networks trained end-to-end in this way are typical black-box models that are hard to interpret. It would therefore be significantly better if the weights of a neural network could be calculated directly. In this paper, we identify the key to calculating the weights of a neural network directly: assigning proper targets to the hidden units. Once such targets are assigned, the network becomes a white-box model that is easy to interpret. We thus propose a framework for solving the weights of a neural network and provide a sample implementation, which we tested in various classification and regression experiments. Compared with networks trained by traditional methods, those constructed from solved weights achieved similar or better performance on many tasks while remaining interpretable. Given the early stage of the proposed approach, many improvements can be expected in future developments. (C) 2020 Elsevier B.V. All rights reserved.
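The abstract's central idea, assigning explicit targets to the hidden units so that each layer's weights can be solved in closed form without gradients, can be illustrated with a short sketch. The Python example below is an assumption-laden illustration, not the paper's actual algorithm: the hidden-target assignment (uniform random values in (-1, 1)) is a placeholder for whatever scheme the paper proposes, biases are omitted, and each layer is solved as an ordinary linear least-squares problem.

import numpy as np

# Minimal sketch of gradient-free weight solving, assuming (as the abstract
# suggests) that once targets are assigned to the hidden units, each layer
# reduces to a linear least-squares problem. The target-assignment rule used
# here (uniform random values) is an illustrative placeholder, not the
# paper's scheme.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                # 200 samples, 10 input features
Y = rng.normal(size=(200, 3))                 # 3 regression outputs

# Placeholder hidden targets in (-1, 1), so arctanh below is well defined.
H_target = rng.uniform(-0.9, 0.9, size=(200, 32))

def solve_layer(A, T):
    """Closed-form least-squares weights W minimizing ||A @ W - T||."""
    W, *_ = np.linalg.lstsq(A, T, rcond=None)
    return W

# Layer 1: choose W1 so that tanh(X @ W1) approximates the assigned targets.
W1 = solve_layer(X, np.arctanh(H_target))
H = np.tanh(X @ W1)

# Layer 2: map the realized hidden activations to the outputs.
W2 = solve_layer(H, Y)
print("training MSE:", np.mean((H @ W2 - Y) ** 2))

In this framing, the crucial design choice is the hidden-target assignment itself; the per-layer solve is routine linear algebra, which is what makes the solved network a white-box model.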
Pages: 409-427
Number of pages: 19