Convergence analysis of deep residual networks

Times Cited: 2
Authors
Huang, Wentao [1 ]
Zhang, Haizhang [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Math Zhuhai, Zhuhai 519082, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep learning; deep residual networks; ReLU networks; convolutional neural networks; convergence; RELU NETWORKS; ERROR-BOUNDS; APPROXIMATION; WIDTH
DOI
10.1142/S021953052350029X
CLC Number
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Various powerful deep neural network architectures have made great contributions to the exciting successes of deep learning in the past two decades. Among them, deep Residual Networks (ResNets) are of particular importance because they demonstrated great usefulness in computer vision by winning first place in many deep learning competitions. Also, ResNets were the first class of truly deep neural networks in the history of deep learning. It is of both mathematical interest and practical importance to understand the convergence of deep ResNets. We study the convergence of deep ResNets, as the depth tends to infinity, in terms of the parameters of the networks. To this end, we first give a matrix-vector description of general deep neural networks with shortcut connections and, using the notion of activation matrices, derive an explicit expression for such networks. The convergence question is then reduced to the convergence of two series involving infinite products of non-square matrices. By studying these two series, we establish a sufficient condition for pointwise convergence of ResNets. We also conduct experiments on benchmark machine learning data to illustrate the potential usefulness of the results.
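To make the reduction described in the abstract concrete, the display below is a minimal worked sketch in illustrative notation, not the paper's own, assuming for simplicity that all layers share the same width so that the shortcut connection is the identity map. Writing the ReLU componentwise as a diagonal 0-1 activation matrix acting on its argument, each residual block becomes affine in the previous layer's output, and unrolling the recursion expresses the network through products of the matrices I + D_k W_k.

% Illustrative notation only; W_n, b_n, D_n are not taken from the paper.
x^{(n)} = x^{(n-1)} + \sigma\!\left(W_n x^{(n-1)} + b_n\right),
\qquad \sigma(z) = D(z)\,z, \quad D(z) = \operatorname{diag}\!\left(\mathbf{1}_{\{z_i > 0\}}\right).

% With D_n := D\!\left(W_n x^{(n-1)} + b_n\right), each block is affine in x^{(n-1)}:
x^{(n)} = \left(I + D_n W_n\right) x^{(n-1)} + D_n b_n .

% Unrolling from the input x^{(0)} (empty products read as I) gives
x^{(n)} = \left(I + D_n W_n\right)\cdots\left(I + D_1 W_1\right) x^{(0)}
        + \sum_{k=1}^{n} \left(I + D_n W_n\right)\cdots\left(I + D_{k+1} W_{k+1}\right) D_k b_k ,

% so pointwise convergence as n -> infinity hinges on the convergence of the
% infinite matrix product and of the accompanying series.

In the general setting the layer widths vary, so the corresponding factors are non-square matrices, which is why the abstract speaks of infinite products of non-square matrices; the sketch above only illustrates the equal-width case.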
Pages: 351-382
Page count: 32