Convergence analysis of deep residual networks

Times Cited: 2
Authors
Huang, Wentao [1]
Zhang, Haizhang [1]
Affiliations
[1] Sun Yat-sen University, School of Mathematics (Zhuhai), Zhuhai 519082, Guangdong, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Deep learning; deep residual networks; ReLU networks; convolutional neural networks; convergence
Keywords Plus
RELU NETWORKS; ERROR-BOUNDS; APPROXIMATION; WIDTH
DOI
10.1142/S021953052350029X
CLC Number
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Various powerful deep neural network architectures have contributed to the remarkable successes of deep learning over the past two decades. Among them, deep Residual Networks (ResNets) are of particular importance: they proved highly effective in computer vision, winning first place in many deep learning competitions, and they were the first class of neural networks in the history of deep learning to be truly deep. Understanding the convergence of deep ResNets is therefore of both mathematical interest and practical importance. We study the convergence of deep ResNets, as the depth tends to infinity, in terms of the parameters of the networks. To this end, we first give a matrix-vector description of general deep neural networks with shortcut connections and, using the notion of activation matrices, formulate an explicit expression for such networks. The convergence question is then reduced to the convergence of two series involving infinite products of non-square matrices. By studying these two series, we establish a sufficient condition for pointwise convergence of ResNets. We also conduct experiments on benchmark machine learning data to illustrate the potential usefulness of the results.
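To make the reduction described in the abstract concrete, the following is a minimal sketch in the simplified fixed-width case with identity shortcuts; the notation ($h_i$, $W_i$, $V_i$, $b_i$, $D_i$) is introduced here for illustration only and need not match the paper's actual setup, which works with non-square matrices.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% One residual block with an identity shortcut; \sigma is the componentwise ReLU.
\[
  h_i \;=\; h_{i-1} + V_i\,\sigma\!\bigl(W_i h_{i-1} + b_i\bigr),
  \qquad i = 1,\dots,n, \qquad h_0 = x .
\]
% Since ReLU acts coordinatewise, \sigma(z) = D(z)\,z with D(z) diagonal and
% \{0,1\}-valued (an ``activation matrix'').  Writing D_i for the activation
% matrix produced at block i on the fixed input x, each block is affine in h_{i-1}:
\[
  h_i \;=\; \bigl(I + V_i D_i W_i\bigr)\, h_{i-1} \;+\; V_i D_i b_i .
\]
% The depth-n output therefore unrolls into a product term plus a series term
% (products are taken in descending order of the index; an empty product is I):
\[
  h_n \;=\; \Bigl(\prod_{i=n}^{1}\bigl(I + V_i D_i W_i\bigr)\Bigr) x
        \;+\; \sum_{i=1}^{n}\Bigl(\prod_{j=n}^{i+1}\bigl(I + V_j D_j W_j\bigr)\Bigr) V_i D_i b_i .
\]
% Because the D_i depend on x, the expression holds pointwise, and convergence
% as n -> infinity reduces to the convergence of an infinite matrix product and
% of an accompanying series, which is the reduction the abstract describes.
\end{document}
```

A sufficient condition of the kind the abstract alludes to would then control both terms through the sizes of the parameters, for instance via summability conditions such as $\sum_i \|V_i\|\,\|W_i\| < \infty$ and $\sum_i \|V_i\|\,\|b_i\| < \infty$ (using $\|D_i\| \le 1$); this is only an indicative reading of the abstract, not the paper's precise statement.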
Pages: 351-382
Page count: 32