Selective Dropout for Deep Neural Networks

Cited by: 5
Authors
Barrow, Erik [1]
Eastwood, Mark [1]
Jayne, Chrisina [2]
Affiliations
[1] Coventry Univ, Coventry, W Midlands, England
[2] Robert Gordon Univ, Aberdeen, Scotland
Keywords
MNIST; Artificial neural network; Deep learning; Dropout network; Non-random dropout; Selective dropout
DOI
10.1007/978-3-319-46675-0_57
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Dropout has been proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three alternative methods for performing dropout on a deep neural network that improve the effectiveness of dropout over the same training period. These methods select the neurons to be dropped using statistics computed for each neuron: its change in weight, the average size of its weights, and the variance of its output. We found that increasing the probability of dropping neurons with smaller values of these statistics, and decreasing it for neurons with larger values, improved results when training over 10,000 epochs. The most effective of these was the Output Variance method, which gave an average improvement of 1.17% in accuracy over traditional dropout.
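To make the ranking idea concrete, here is a minimal sketch of the Output Variance variant in Python/NumPy. The abstract does not give the exact mapping from a neuron's statistic to its drop probability, so the linear ramp around the base dropout rate (the `spread` parameter) and the `selective_dropout_mask` helper are hypothetical choices for illustration, not the paper's exact scheme.

```python
# Sketch of selective dropout driven by per-neuron output variance.
# Assumption: drop probability is a linear ramp over the variance ranking,
# centred on the usual base dropout rate; the paper may use a different mapping.
import numpy as np

def selective_dropout_mask(activations, base_rate=0.5, spread=0.4, rng=None):
    """Per-neuron dropout mask for one hidden layer.

    activations : (batch, neurons) array of the layer's outputs for a mini-batch.
    base_rate   : average fraction of neurons to drop, as in standard dropout.
    spread      : how strongly drop probability varies with the statistic.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Statistic: each neuron's output variance across the batch.
    stat = activations.var(axis=0)
    # Normalised rank in [0, 1]: 0 = smallest variance, 1 = largest.
    rank = stat.argsort().argsort() / max(len(stat) - 1, 1)
    # Low-variance neurons get a higher drop probability, high-variance a lower one.
    drop_prob = np.clip(base_rate + spread * (0.5 - rank), 0.0, 0.99)
    keep = rng.random(drop_prob.shape) >= drop_prob
    # Inverted-dropout scaling keeps the expected layer output unchanged.
    return keep / (1.0 - drop_prob)

# Usage: mask one hidden layer's activations during a training step.
acts = np.random.randn(64, 128)              # batch of 64, 128 hidden units
masked = acts * selective_dropout_mask(acts)
```

The inverted-dropout rescaling (dividing kept activations by their keep probability) is another assumption borrowed from standard practice; the abstract does not state how, or whether, the paper rescales.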
Pages: 519-528 (10 pages)
Related Papers
(50 records in total; entries [41]-[50] shown below)
  • [41] GUIDE: Training Deep Graph Neural Networks via Guided Dropout Over Edges
    Wang, Jie
    Liang, Jianqing
    Liang, Jiye
    Yao, Kaixuan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35(4): 4465-4477
  • [42] Random image frequency aggregation dropout in image classification for deep convolutional neural networks
    Nam, Ju-Hyeon
    Lee, Sang-Chul
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 232
  • [43] Reliable Prediction Errors for Deep Neural Networks Using Test-Time Dropout
    Cortes-Ciriano, Isidro
    Bender, Andreas
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2019, 59(7): 3330-3339
  • [44] Controlled Dropout: a Different Approach to Using Dropout on Deep Neural Network
    Ko, ByungSoo
    Kim, Han-Gyu
    Oh, Kyo-Joong
    Choi, Ho-Jin
    2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2017: 358-362
  • [45] Checkerboard Dropout: A Structured Dropout With Checkerboard Pattern for Convolutional Neural Networks
    Nguyen, Khanh-Binh
    Choi, Jaehyuk
    Yang, Joon-Sung
    IEEE ACCESS, 2022, 10: 76044-76054
  • [46] Controlled Dropout: a Different Dropout for Improving Training Speed on Deep Neural Network
    Ko, ByungSoo
    Kim, Han-Gyu
    Choi, Ho-Jin
    2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017: 972-977
  • [47] Towards dropout training for convolutional neural networks
    Wu, Haibing
    Gu, Xiaodong
    NEURAL NETWORKS, 2015, 71: 1-10
  • [48] Augmenting Recurrent Neural Networks Resilience by Dropout
    Bacciu, Davide
    Crecchi, Francesco
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31(1): 345-351
  • [49] Analysis on the Dropout Effect in Convolutional Neural Networks
    Park, Sungheon
    Kwak, Nojun
    COMPUTER VISION - ACCV 2016, PT II, 2017, 10112: 189-204
  • [50] A General Approach to Dropout in Quantum Neural Networks
    Scala, Francesco
    Ceschini, Andrea
    Panella, Massimo
    Gerace, Dario
    ADVANCED QUANTUM TECHNOLOGIES, 2023