Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization

Cited: 0
Authors
Noh, Hyeonwoo [1]
You, Tackgeun [1]
Mun, Jonghwan [1]
Han, Bohyung [1]
Affiliations
[1] POSTECH, Dept Comp Sci & Engn, Pohang, South Korea
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Overfitting is one of the most critical challenges in deep neural networks, and various regularization methods have been proposed to improve generalization performance. Injecting noise into hidden units during training, e.g., dropout, is known to be a successful regularizer, but it is still not well understood why such training techniques work in practice and how their benefit can be maximized in the presence of two conflicting objectives: fitting the true data distribution and preventing overfitting through regularization. This paper addresses these issues by 1) interpreting conventional training with regularization by noise injection as optimizing a lower bound of the true objective and 2) proposing a technique that achieves a tighter lower bound by using multiple noise samples per training example in each stochastic gradient descent iteration. We demonstrate the effectiveness of our idea in several computer vision applications.
Pages: 10
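As a rough illustration of the multi-sample idea described in the abstract, the sketch below is a minimal, hypothetical PyTorch example, not the authors' released code; the network size, the number of noise samples k, and the classification setup are assumptions. It draws k dropout masks per training example in one SGD step and combines the per-sample likelihoods with a log-mean-exp, which by Jensen's inequality gives a tighter lower bound on the marginal log-likelihood than averaging the k per-sample log-likelihoods.

```python
# Minimal sketch (assumed setup, not the authors' code): multi-sample dropout training
# that optimizes a tighter lower bound via log-mean-exp over k noise samples per example.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropoutMLP(nn.Module):
    def __init__(self, in_dim=784, hidden=256, num_classes=10, p=0.5):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.drop = nn.Dropout(p)          # noise injection (dropout) on hidden units
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x):
        return self.fc2(self.drop(F.relu(self.fc1(x))))

def multi_sample_lower_bound(model, x, y, k=5):
    """log( (1/k) * sum_j p(y | x, noise_j) ), averaged over the batch."""
    log_probs = []
    for _ in range(k):                     # each forward pass draws a fresh dropout mask
        logp = F.log_softmax(model(x), dim=1)
        log_probs.append(logp.gather(1, y.unsqueeze(1)).squeeze(1))  # log p(y | x, noise_j)
    log_probs = torch.stack(log_probs, dim=0)                        # shape (k, batch)
    return (torch.logsumexp(log_probs, dim=0) - math.log(k)).mean()

# Usage: maximize the bound (minimize its negative) with an ordinary SGD step.
model = DropoutMLP()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = -multi_sample_lower_bound(model, x, y, k=5)
opt.zero_grad()
loss.backward()
opt.step()
```

Setting k=1 recovers ordinary dropout training, so the additional cost of the tighter bound is k-1 extra forward passes per update.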
Related Papers
50 records in total
  • [21] Deep Neural Networks Optimization Based On Deconvolutional Networks
    Liu, Zhoufeng
    Zhang, Chi
    Li, Chunlei
    Ding, Shumin
    Liu, Shanliang
    Dong, Yan
    PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON GRAPHICS AND SIGNAL PROCESSING (ICGSP 2018), 2018, : 7 - 11
  • [22] Regularizing Deep Networks With Semantic Data Augmentation
    Wang, Yulin
    Huang, Gao
    Song, Shiji
    Pan, Xuran
    Xia, Yitong
    Wu, Cheng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (07) : 3733 - 3748
  • [23] Interpretation of Deep Neural Networks Based on Decision Trees
    Ueno, Tsukasa
    Zhao, Qiangfu
    2018 16TH IEEE INT CONF ON DEPENDABLE, AUTONOM AND SECURE COMP, 16TH IEEE INT CONF ON PERVAS INTELLIGENCE AND COMP, 4TH IEEE INT CONF ON BIG DATA INTELLIGENCE AND COMP, 3RD IEEE CYBER SCI AND TECHNOL CONGRESS (DASC/PICOM/DATACOM/CYBERSCITECH), 2018, : 256 - 261
  • [24] Regularizing Neural Networks with Adaptive Local Drop
    Cao, Binbin
    Li, Jianmin
    Zhang, Bo
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [25] Graph neural networks for deep portfolio optimization
    Ekmekcioglu, Omer
    Pinar, Mustafa C.
     NEURAL COMPUTING & APPLICATIONS, 2023, 35 (28): 20663 - 20674
  • [26] An Optimization Strategy for Deep Neural Networks Training
    Wu, Tingting
    Zeng, Peng
    Song, Chunhe
    2022 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, COMPUTER VISION AND MACHINE LEARNING (ICICML), 2022, : 596 - 603
  • [27] Graph neural networks for deep portfolio optimization
    Ömer Ekmekcioğlu
    Mustafa Ç. Pınar
    Neural Computing and Applications, 2023, 35 : 20663 - 20674
  • [28] A Study of Optimization in Deep Neural Networks for Regression
    Chen, Chieh-Huang
    Lai, Jung-Pin
    Chang, Yu-Ming
    Lai, Chi-Ju
    Pai, Ping-Feng
    ELECTRONICS, 2023, 12 (14)
  • [29] Regularizing deep neural networks for medical image analysis with augmented batch normalization
    Zhu, Shengqian
    Yu, Chengrong
    Hu, Junjie
    Applied Soft Computing, 2024, 154
  • [30] Deep Neural Networks for Acoustic Modeling in the Presence of Noise
    Santana, L. M. Q. D.
    Santos, R. M.
    Matos, L. N.
    Macedo, H. T.
    IEEE LATIN AMERICA TRANSACTIONS, 2018, 16 (03) : 918 - 925