Selective Dropout for Deep Neural Networks

Cited: 5
Authors
Barrow, Erik [1 ]
Eastwood, Mark [1 ]
Jayne, Chrisina [2 ]
Affiliations
[1] Coventry Univ, Coventry, W Midlands, England
[2] Robert Gordon Univ, Aberdeen, Scotland
Keywords
MNIST; Artificial neural network; Deep learning; Dropout network; Non-random dropout; Selective dropout;
DOI
10.1007/978-3-319-46675-0_57
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dropout has proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three new alternative methods for performing dropout on a deep neural network that improve the effectiveness of dropout over the same training period. These methods select the neurons to be dropped using statistics computed from each neuron's change in weight, the average size of a neuron's weights, and the output variance of a neuron. We found that increasing the probability of dropping neurons with smaller values of these statistics, and decreasing the probability for those with larger values, gave an improved result in training over 10,000 epochs. The most effective of these was the Output Variance method, giving an average improvement of 1.17% accuracy over traditional dropout.
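The abstract's Output Variance idea can be illustrated with a minimal sketch. The function name, the linear rank-to-probability mapping, and the NumPy formulation below are illustrative assumptions, not the authors' implementation: per-neuron output variance is measured across a batch, and neurons with lower variance receive a higher drop probability while the layer-wide mean drop rate stays near the base rate.

```python
import numpy as np

def selective_dropout_mask(activations, base_rate=0.5, rng=None):
    """Sketch of output-variance selective dropout (hypothetical helper).

    activations: array of shape (batch, neurons).
    Returns a 0/1 keep-mask of shape (neurons,): low-variance neurons
    are dropped with higher probability, high-variance ones with lower,
    and the average drop probability across the layer is ~base_rate.
    """
    rng = np.random.default_rng() if rng is None else rng
    var = activations.var(axis=0)        # per-neuron output variance
    ranks = var.argsort().argsort()      # rank 0 = lowest variance
    n = var.size
    # Linearly scale drop probabilities from 2*base_rate (lowest
    # variance) down to 0 (highest variance); the mean is base_rate.
    p_drop = 2.0 * base_rate * (1.0 - ranks / (n - 1))
    return (rng.random(n) >= p_drop).astype(activations.dtype)
```

With `base_rate=0.5` the lowest-variance neuron is always dropped and the highest-variance one always kept; smaller base rates soften both extremes while preserving the variance-based ordering of drop probabilities.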
Pages: 519-528
Page count: 10
Related papers
50 records in total
  • [31] ANNEALED DROPOUT TRAINING OF DEEP NETWORKS
    Rennie, Steven J.
    Goel, Vaibhava
    Thomas, Samuel
    2014 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY SLT 2014, 2014, : 159 - 164
  • [32] A Dropout Distribution Model on Deep Networks
    Li, Fengqi
    Yang, Helin
    EIGHTH INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING (ICDIP 2016), 2016, 10033
  • [33] Surprising properties of dropout in deep networks
    Helmbold, David P.
    Long, Philip M.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18
  • [34] Universal Approximation in Dropout Neural Networks
    Manita, Oxana A.
    Peletier, Mark A.
    Portegies, Jacobus W.
    Sanders, Jaron
    Senen-Cerda, Albert
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [35] Selective Hardening of Critical Neurons in Deep Neural Networks
    Ruospo, Annachiara
    Gavarini, Gabriele
    Bragaglia, Ilaria
    Traiola, Marcello
    Bosio, Alberto
    Sanchez, Ernesto
    2022 25TH INTERNATIONAL SYMPOSIUM ON DESIGN AND DIAGNOSTICS OF ELECTRONIC CIRCUITS AND SYSTEMS (DDECS), 2022, : 136 - 141
  • [36] Data Selective Deep Neural Networks For Image Classification
    Mendonca, Marcele O. K.
    Ferreira, Jonathas O.
    Diniz, Paulo S. R.
    29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021, : 1376 - 1380
  • [37] Selective BTEX Measurements Using Deep Neural Networks
    Mhanna, Mhanna
    Sy, Mohamed
    Farooq, Aamir
    2021 CONFERENCE ON LASERS AND ELECTRO-OPTICS (CLEO), 2021,
  • [38] Adversarial Dropout for Recurrent Neural Networks
    Park, Sungrae
    Song, Kyungwoo
    Ji, Mingi
    Lee, Wonsung
    Moon, Il-Chul
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 4699 - 4706
  • [39] Understanding Dropout for Graph Neural Networks
    Shu, Juan
    Xi, Bowei
    Li, Yu
    Wu, Fan
    Kamhoua, Charles
    Ma, Jianzhu
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 1128 - 1138
  • [40] Dropout Algorithms for Recurrent Neural Networks
    Watt, Nathan
    du Plessis, Mathys C.
    PROCEEDINGS OF THE ANNUAL CONFERENCE OF THE SOUTH AFRICAN INSTITUTE OF COMPUTER SCIENTISTS AND INFORMATION TECHNOLOGISTS (SAICSIT 2018), 2018, : 72 - 78