Solving the linear interval tolerance problem for weight initialization of neural networks

Cited by: 29
Authors
Adam, S. P. [1 ,2 ]
Karras, D. A. [3 ]
Magoulas, G. D. [4 ]
Vrahatis, M. N. [1 ]
Affiliations
[1] Univ Patras, Dept Math, Computat Intelligence Lab, GR-26110 Patras, Greece
[2] Technol Educ Inst Epirus, Dept Comp Engn, Arta 47100, Greece
[3] Technol Educ Inst Sterea Hellas, Dept Automat, Psahna 34400, Evia, Greece
[4] Univ London, Birkbeck Coll, Dept Comp Sci & Informat Syst, London WC1E 7HX, England
Keywords
Neural networks; Weight initialization; Interval analysis; Linear interval tolerance problem; FEEDFORWARD NETWORKS; STATISTICAL TESTS; TRAINING SPEED; HIGH-DIMENSION; BACKPROPAGATION; ALGORITHM; INTELLIGENCE;
DOI
10.1016/j.neunet.2014.02.006
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Determining good initial conditions for an algorithm used to train a neural network is considered a parameter estimation problem dealing with uncertainty about the initial weights. Interval analysis approaches model uncertainty in parameter estimation problems by using intervals and formulating tolerance problems. Solving a tolerance problem means defining lower and upper bounds of the intervals so that the system functionality is guaranteed within predefined limits. The aim of this paper is to show how the problem of determining the initial weight intervals of a neural network can be defined in terms of solving a linear interval tolerance problem. The proposed linear interval tolerance approach copes with uncertainty about the initial weights without any prior knowledge of, or specific assumptions about, the input data, as required by approaches such as fuzzy sets or rough sets. The proposed method is tested on a number of well-known benchmarks for neural networks trained with the back-propagation family of algorithms. Its efficiency is evaluated with regard to standard performance measures, and the results obtained are compared against those of a number of well-known and established initialization methods. These results provide credible evidence that the proposed method outperforms classical weight initialization methods. (C) 2014 Elsevier Ltd. All rights reserved.
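To make the abstract's idea concrete, here is a minimal, hypothetical sketch of an interval-tolerance-style computation — not the authors' algorithm. It assumes a single neuron, symmetric weight intervals `[-w_max, w_max]`, and a pre-activation tolerance `[-t, t]` (chosen to keep a sigmoid out of saturation); the function `init_weight_bound` and all parameter names are illustrative inventions.

```python
import random

def init_weight_bound(x_lo, x_hi, t=4.0):
    """Hypothetical illustration of the interval-tolerance idea:
    find the largest symmetric bound w_max such that ANY weight vector
    in [-w_max, w_max]^n keeps the pre-activation sum(w_i * x_i) inside
    the tolerance [-t, t] for EVERY input with x_i in [x_lo[i], x_hi[i]].

    The worst case of |sum(w_i * x_i)| over both boxes is
    w_max * sum(max(|x_lo[i]|, |x_hi[i]|)), so we solve for w_max.
    """
    mag = sum(max(abs(lo), abs(hi)) for lo, hi in zip(x_lo, x_hi))
    return t / mag

# Example: 4 inputs bounded in [-1, 1]; keep pre-activations in [-4, 4],
# roughly the non-saturated region of a logistic sigmoid.
x_lo, x_hi = [-1.0] * 4, [1.0] * 4
w_max = init_weight_bound(x_lo, x_hi, t=4.0)

# Sanity check: random weights and inputs drawn from the intervals never
# push the pre-activation outside the tolerance [-4, 4].
rng = random.Random(0)
for _ in range(1000):
    w = [rng.uniform(-w_max, w_max) for _ in range(4)]
    x = [rng.uniform(lo, hi) for lo, hi in zip(x_lo, x_hi)]
    assert abs(sum(wi * xi for wi, xi in zip(w, x))) <= 4.0
```

The paper's actual formulation solves a linear interval tolerance problem over whole weight layers; this sketch only shows the flavor of bounding weights from interval constraints rather than sampling them from an arbitrary fixed range.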
Pages: 17-37
Page count: 21
Related Papers
50 records in total
  • [1] SOLVING THE LINEAR INTERVAL TOLERANCE PROBLEM
    SHARY, SP
    MATHEMATICS AND COMPUTERS IN SIMULATION, 1995, 39 (1-2) : 53 - 85
  • [2] An interval approach for weight's initialization of feedforward neural networks
    Jamett, Marcela
    Acuna, Gonzalo
    MICAI 2006: ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2006, 4293 : 305 - +
  • [3] A weight initialization based on the linear product structure for neural networks
    Chen, Qipin
    Hao, Wenrui
    He, Juncai
    APPLIED MATHEMATICS AND COMPUTATION, 2022, 415
  • [4] Interval Based Weight Initialization Method for Sigmoidal Feedforward Artificial Neural Networks
    Sodhi, Sartaj Singh
    Chandra, Pravin
    2ND AASRI CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND BIOINFORMATICS, 2014, 6 : 19 - 25
  • [5] An interval linear tolerance problem
    Shary, SP
    AUTOMATION AND REMOTE CONTROL, 2004, 65 (10) : 1653 - 1666
  • [6] A review on weight initialization strategies for neural networks
    Narkhede, Meenal V.
    Bartakke, Prashant P.
    Sutaone, Mukul S.
    ARTIFICIAL INTELLIGENCE REVIEW, 2022, 55 (01) : 291 - 322
  • [7] Initialising Deep Neural Networks: An Approach Based on Linear Interval Tolerance
    Stamate, Cosmin
    Magoulas, George D.
    Thomas, Michael S. C.
    PROCEEDINGS OF SAI INTELLIGENT SYSTEMS CONFERENCE (INTELLISYS) 2016, VOL 2, 2018, 16 : 477 - 485
  • [8] An overview on weight initialization methods for feedforward neural networks
    de Sousa, Celso A. R.
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 52 - 59