Asymptotic Convergence Rate of Dropout on Shallow Linear Neural Networks

Cited by: 1
Authors
Senen-Cerda, Albert [1 ]
Sanders, Jaron [1 ]
Affiliation
[1] Eindhoven Univ Technol, Eindhoven, Netherlands
Keywords
Dropout; neural networks; convergence rate; gradient flow;
DOI
10.1145/3530898
CLC number
TP3 [computing technology, computer technology];
Discipline code
0812;
Abstract
We analyze the convergence rate of gradient flows on objective functions induced by Dropout and Dropconnect when applied to shallow linear Neural Networks (NNs), which can also be viewed as performing matrix factorization with a particular regularizer. Dropout algorithms such as these are thus regularization techniques that use {0, 1}-valued random variables to filter weights during training in order to avoid co-adaptation of features. By leveraging a recent result on nonconvex optimization and conducting a careful analysis of the set of minimizers as well as the Hessian of the loss function, we obtain (i) a local convergence proof of the gradient flow and (ii) a bound on the convergence rate that depends on the data, the dropout probability, and the width of the NN. Finally, we compare this theoretical bound to numerical simulations, which are in qualitative agreement with the convergence bound and match it when starting sufficiently close to a minimizer.
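For intuition, a minimal Python/NumPy sketch of the setting the abstract describes follows: a shallow linear NN y = W2 W1 x whose hidden layer is filtered by i.i.d. {0, 1}-valued Bernoulli variables at every training step, optimized by plain gradient descent as a discrete stand-in for the gradient flow analyzed in the paper. This is not the authors' code; the dimensions, keep-probability p, step size, and synthetic data are all hypothetical choices for illustration.

import numpy as np

# Minimal illustrative sketch (not the paper's code): Dropout on the hidden
# layer of a shallow linear NN, trained by gradient descent on synthetic data.
rng = np.random.default_rng(0)

d_in, width, d_out, n = 5, 8, 3, 200      # hypothetical dimensions and sample size
p, lr, steps = 0.8, 1e-2, 5000            # keep-probability, step size, iterations

X = rng.standard_normal((d_in, n))
Y = rng.standard_normal((d_out, d_in)) @ X        # synthetic linear targets
W1 = 0.1 * rng.standard_normal((width, d_in))     # first-layer weights
W2 = 0.1 * rng.standard_normal((d_out, width))    # second-layer weights

for _ in range(steps):
    f = rng.binomial(1, p, size=width)    # {0,1}-valued Dropout filter variables
    H = np.diag(f)
    R = W2 @ H @ W1 @ X - Y               # residual of the filtered network
    W2 -= lr * (R @ (H @ W1 @ X).T) / n   # gradient step with respect to W2
    W1 -= lr * (H @ W2.T @ R @ X.T) / n   # gradient step with respect to W1

# Diagnostic: squared error of the Dropout-averaged network p * W2 @ W1
print(np.mean((p * (W2 @ W1 @ X) - Y) ** 2))

Averaging the filtered squared error over the Bernoulli variables yields the squared error of the scaled product p * W2 W1 plus a data-dependent penalty on the two factors, which is the matrix-factorization-with-regularizer view mentioned in the abstract.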
Pages: 53
Related papers
50 in total
  • [21] Robustness of convergence in finite time for linear programming neural networks
    Di Marco, M
    Forti, M
    Grazzini, M
    INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, 2006, 34 (03) : 307 - 316
  • [22] Convergence Analysis for Learning Orthonormal Deep Linear Neural Networks
    Qin, Zhen
    Tan, Xuwei
    Zhu, Zhihui
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 795 - 799
  • [23] Convergence analysis of neural networks that solve linear programming problems
    Ferreira, LV
    Kaszkurewicz, E
    Bhaya, A
    PROCEEDING OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 2476 - 2481
  • [24] On the rate of convergence of image classifiers based on convolutional neural networks
    Kohler, Michael
    Krzyzak, Adam
    Walter, Benjamin
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2022, 74 (06) : 1085 - 1108
  • [26] Estimate of exponential convergence rate and exponential stability for neural networks
    Yi, Z
    Heng, PA
    Fu, AWC
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1999, 10 (06): 1487 - 1493
  • [28] Toward moderate overparameterization: Global convergence guarantees for training shallow neural networks
    Oymak, Samet
    Soltanolkotabi, Mahdi
    IEEE JOURNAL ON SELECTED AREAS IN INFORMATION THEORY, 2020, 1 (01): 84 - 105
  • [29] The Convergence Rate of Neural Networks for Learned Functions of Different Frequencies
    Basri, Ronen
    Jacobs, David
    Kasten, Yoni
    Kritchman, Shira
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [30] Universal Approximation in Dropout Neural Networks
    Manita, Oxana A.
    Peletier, Mark A.
    Portegies, Jacobus W.
    Sanders, Jaron
    Senen-Cerda, Albert
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23