Convergence analysis of deep residual networks

Cited: 2
Authors
Huang, Wentao [1 ]
Zhang, Haizhang [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Math Zhuhai, Zhuhai 519082, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; deep residual networks; ReLU networks; convolutional neural networks; convergence; error bounds; approximation; width;
DOI
10.1142/S021953052350029X
CLC Classification Number
O29 [Applied Mathematics];
Discipline Classification Code
070104;
Abstract
Various powerful deep neural network architectures have contributed to the remarkable successes of deep learning over the past two decades. Among them, deep Residual Networks (ResNets) are of particular importance: they proved highly effective in computer vision, winning first place in many deep learning competitions, and they were the first class of neural networks in the history of deep learning to be truly deep. Understanding the convergence of deep ResNets is therefore of both mathematical interest and practical value. We study the convergence of deep ResNets as the depth tends to infinity, in terms of the parameters of the networks. To this end, we first give a matrix-vector description of general deep neural networks with shortcut connections and, using the notion of activation matrices, formulate an explicit expression for such networks. The convergence question then reduces to the convergence of two series involving infinite products of non-square matrices. By analyzing these two series, we establish a sufficient condition for the pointwise convergence of ResNets. We also conduct experiments on benchmark machine learning data to illustrate the potential usefulness of the results.
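The phenomenon the abstract studies can be illustrated numerically: if the residual blocks x_{k+1} = x_k + V_k ReLU(W_k x_k) have weights whose norms are summable over depth, the outputs stabilize as more blocks are stacked. The sketch below is an illustrative assumption, not the paper's construction: the layer width, the 1/k^2 weight scaling, and all names (`resnet_output`, `layers`, etc.) are invented for this example, and the summable-norm condition shown is only of the same flavor as the sufficient condition the paper establishes.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # width of every layer (illustrative choice)
max_depth = 400

# Fixed random layers. The residual weights V_k are scaled by 1/k^2,
# so the series sum_k ||V_k|| converges -- a summable-norm condition
# of the kind under which deep ResNet outputs converge pointwise.
layers = []
for k in range(1, max_depth + 1):
    W = rng.standard_normal((d, d)) / np.sqrt(d)
    V = rng.standard_normal((d, d)) / (np.sqrt(d) * k**2)
    layers.append((W, V))

def relu(x):
    return np.maximum(x, 0.0)

def resnet_output(x0, depth):
    """Forward pass x_{k+1} = x_k + V_k relu(W_k x_k) through `depth` blocks."""
    x = x0.copy()
    for W, V in layers[:depth]:
        x = x + V @ relu(W @ x)
    return x

x0 = rng.standard_normal(d)
y100 = resnet_output(x0, 100)
y400 = resnet_output(x0, 400)
# The gap between depth-100 and depth-400 outputs is small: adding
# blocks beyond depth 100 barely moves the output, since the tail
# sum of ||V_k|| over k > 100 is tiny.
print(float(np.linalg.norm(y400 - y100)))
```

Without the 1/k^2 scaling (i.e. with residual weights of constant norm), the same iteration typically diverges as the depth grows, which is why a condition on the decay of the weights is needed at all.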
Pages: 351-382
Page count: 32
Related Papers
50 records total
  • [21] Deep limits of residual neural networks
    Thorpe, Matthew
    van Gennip, Yves
    RESEARCH IN THE MATHEMATICAL SCIENCES, 2023, 10 (01)
  • [22] Scaling Properties of Deep Residual Networks
    Cohen, Alain-Sam
    Cont, Rama
    Rossier, Alain
    Xu, Renyuan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [23] Convergence analysis of the deep neural networks based globalized dual heuristic programming
    Kim, Jong Woo
    Oh, Tae Hoon
    Son, Sang Hwan
    Jeong, Dong Hwi
    Lee, Jong Min
    AUTOMATICA, 2020, 122
  • [24] On the Convergence and Sample Complexity Analysis of Deep Q-Networks with ε-Greedy Exploration
    Zhang, Shuai
    Li, Hongkang
    Wang, Meng
    Liu, Miao
    Chen, Pin-Yu
    Lu, Songtao
    Liu, Sijia
    Murugesan, Keerthiram
    Chaudhury, Subhajit
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [25] Convergence Analysis of PSO for Hyper-Parameter Selection in Deep Neural Networks
    Nalepa, Jakub
    Lorenzo, Pablo Ribalta
    ADVANCES ON P2P, PARALLEL, GRID, CLOUD AND INTERNET COMPUTING (3PGCIC-2017), 2018, 13 : 284 - 295
  • [26] Analysis of Gradient Degradation and Feature Map Quality in Deep All-Convolutional Neural Networks Compared to Deep Residual Networks
    Gao, Wei
    McDonnell, Mark D.
    NEURAL INFORMATION PROCESSING (ICONIP 2017), PT II, 2017, 10635 : 612 - 621
  • [27] Semantic Segmentation with Modified Deep Residual Networks
    Chen, Xinze
    Cheng, Guangliang
    Cai, Yinghao
    Wen, Dayong
    Li, Heping
    PATTERN RECOGNITION (CCPR 2016), PT II, 2016, 663 : 42 - 54
  • [28] Pulmonary nodule classification with deep residual networks
    Nibali, Aiden
    He, Zhen
    Wollersheim, Dennis
    INTERNATIONAL JOURNAL OF COMPUTER ASSISTED RADIOLOGY AND SURGERY, 2017, 12 (10) : 1799 - 1808
  • [29] Detection of Anomalous Diffusion with Deep Residual Networks
    Gajowczyk, Milosz
    Szwabinski, Janusz
    ENTROPY, 2021, 23 (06)
  • [30] Expectile regression via deep residual networks
    Yin, Yiyi
    Zou, Hui
    STAT, 2021, 10 (01):