Selective Dropout for Deep Neural Networks

Cited by: 5
Authors
Barrow, Erik [1 ]
Eastwood, Mark [1 ]
Jayne, Chrisina [2 ]
Affiliations
[1] Coventry Univ, Coventry, W Midlands, England
[2] Robert Gordon Univ, Aberdeen, Scotland
Keywords
MNIST; Artificial neural network; Deep learning; Dropout network; Non-random dropout; Selective dropout
DOI
10.1007/978-3-319-46675-0_57
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Dropout has been proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three new alternative methods for performing dropout on a deep neural network that improve the effectiveness of dropout over the same training period. These methods select neurons to be dropped using statistics computed from a neuron's change in weight, the average magnitude of a neuron's weights, and the variance of a neuron's output. We found that increasing the probability of dropping neurons with smaller values of these statistics, and decreasing the probability for those with larger values, gave improved results when training over 10,000 epochs. The most effective of these was the Output Variance method, giving an average improvement of 1.17% accuracy over traditional dropout.
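The selection mechanism described in the abstract can be sketched in NumPy. The abstract does not specify how dropout probabilities are assigned from the statistic, so the rank-based linear scheme below (lowest-variance neurons get the highest drop probability, with the mean probability matching the target dropout rate) is an illustrative assumption, not the paper's exact method; the function name is hypothetical.

```python
import numpy as np

def selective_dropout_mask(activations, drop_fraction=0.5, rng=None):
    """Illustrative sketch of variance-based selective dropout.

    activations: (batch, n_neurons) array of a layer's outputs.
    drop_fraction: target average fraction of neurons to drop.
    Returns a (n_neurons,) keep-mask of 0s and 1s.
    """
    rng = np.random.default_rng() if rng is None else rng
    variance = activations.var(axis=0)      # per-neuron output variance
    # Rank each neuron by its variance (0 = smallest variance).
    rank = variance.argsort().argsort()
    n = len(variance)
    # Drop probability decreases linearly with rank; the scaling keeps
    # the mean probability approximately equal to drop_fraction.
    p_drop = np.clip((n - rank) / (n + 1) * 2 * drop_fraction, 0.0, 1.0)
    keep = rng.random(n) >= p_drop
    return keep.astype(activations.dtype)
```

A mask produced this way would be applied multiplicatively to the layer's activations during training, so low-variance neurons are silenced more often than high-variance ones.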
Pages: 519-528
Page count: 10
Related Papers
50 records in total
  • [1] Variational Dropout Sparsifies Deep Neural Networks
    Molchanov, Dmitry
    Ashukha, Arsenii
    Vetrov, Dmitry
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017
  • [2] Dropout Rademacher complexity of deep neural networks
    Gao, Wei
    Zhou, Zhi-Hua
    SCIENCE CHINA-INFORMATION SCIENCES, 2016, 59 (07) : 173 - 184
  • [3] Regularization of deep neural networks with spectral dropout
    Khan, Salman H.
    Hayat, Munawar
    Porikli, Fatih
    NEURAL NETWORKS, 2019, 110 : 82 - 90
  • [4] Jumpout: Improved Dropout for Deep Neural Networks with ReLUs
    Wang, Shengjie
    Zhou, Tianyi
    Bilmes, Jeff A.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [5] Excitation Dropout: Encouraging Plasticity in Deep Neural Networks
    Zunino, Andrea
    Bargal, Sarah Adel
    Morerio, Pietro
    Zhang, Jianming
    Sclaroff, Stan
    Murino, Vittorio
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2021, 129 (04) : 1139 - 1152
  • [6] Dropout with Tabu Strategy for Regularizing Deep Neural Networks
    Ma, Zongjie
    Sattar, Abdul
    Zhou, Jun
    Chen, Qingliang
    Su, Kaile
    COMPUTER JOURNAL, 2020, 63 (07) : 1031 - 1038
  • [7] Deep Learning Convolutional Neural Networks with Dropout - a Parallel Approach
    Shen, Jingyi
    Shafiq, M. Omair
    2018 17TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2018, : 572 - 577