On global convergence of ResNets: From finite to infinite width using linear parameterization

Cited: 0
Authors
Barboni, Raphael [1 ]
Peyre, Gabriel [2 ,3 ]
Vialard, Francois-Xavier [4 ]
Affiliations
[1] PSL Univ, ENS, Paris, France
[2] PSL Univ, CNRS, Paris, France
[3] PSL Univ, ENS, Paris, France
[4] Univ Gustave Eiffel, CNRS, LIGM, Champs Sur Marne, France
Funding
European Research Council;
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Overparameterization is a key ingredient for explaining the global convergence of gradient descent (GD) for neural networks in the absence of convexity. Besides the well-studied lazy regime, infinite-width (mean-field) analyses have been developed for shallow networks using convex optimization techniques. To bridge the gap between the lazy and mean-field regimes, we study Residual Networks (ResNets) whose residual blocks are linearly parameterized while remaining nonlinear. Such ResNets admit both infinite-depth and infinite-width limits, encoding residual blocks in a Reproducing Kernel Hilbert Space (RKHS). In this limit, we prove a local Polyak-Lojasiewicz inequality; thus every critical point is a global minimizer, and a local convergence result for GD holds, recovering the lazy regime. In contrast with other mean-field studies, our analysis applies to both parametric and non-parametric cases under an expressivity condition on the residuals. It also yields a practical, quantified recipe: starting from a universal RKHS, Random Fourier Features are applied to obtain a finite-dimensional parameterization satisfying our expressivity condition with high probability.
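The last step of the abstract's recipe, approximating a universal (Gaussian) RKHS with Random Fourier Features, can be sketched as follows. This is an illustrative sketch of the standard RFF construction (Rahimi and Recht), not the authors' code; the function name and parameters are our own.

```python
import numpy as np

def random_fourier_features(X, n_features=256, gamma=1.0, seed=0):
    """Random Fourier Feature map approximating the Gaussian RKHS.

    Returns phi(X) with rows phi(x) = sqrt(2/D) * cos(x @ W + b), where
    W ~ N(0, 2*gamma*I) and b ~ U[0, 2*pi), so that
    phi(x) . phi(y) ~= exp(-gamma * ||x - y||^2) as D grows.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the Gaussian kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    # Random phases make a single cosine an unbiased kernel estimator.
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

With `n_features` large, the Gram matrix `phi(X) @ phi(X).T` concentrates around the exact Gaussian kernel matrix, which is the sense in which a finite-dimensional parameterization can inherit, with high probability, the expressivity of the universal RKHS.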
Pages: 13
Related Papers
50 records in total
  • [1] Infinite dimensional general linear groups have finite width
    Tolstykh, V. A.
    SIBERIAN MATHEMATICAL JOURNAL, 2006, 47 (05) : 950 - 954
  • [3] Markov chain convergence: From finite to infinite
    Rosenthal, JS
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 1996, 62 (01) : 55 - 72
  • [4] Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain
    Forti, M
    Nistri, P
    Papini, D
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2005, 16 (06): : 1449 - 1463
  • [5] Linear Convergence of Gradient Descent for Finite Width Over-parametrized Linear Networks with General Initialization
    Xu, Ziqing
    Min, Hancheng
    Tarmoun, Salma
    Mallada, Enrique
    Vidal, Rene
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206, 2023, 206
  • [6] Unitary circuits of finite depth and infinite width from quantum channels
    Gopalakrishnan, Sarang
    Lamacraft, Austen
    PHYSICAL REVIEW B, 2019, 100 (06)
  • [7] Linear Temporal Logic - From Infinite to Finite Horizon
    Tabajara, Lucas M.
    Vardi, Moshe Y.
    AUTOMATED TECHNOLOGY FOR VERIFICATION AND ANALYSIS, ATVA 2021, 2021, 12971 : 3 - 12
  • [8] An Improvement of Convergence in Finite Element Analysis With Infinite Element Using Deflation
    Ito, Hiroki
    Watanabe, Kota
    Igarashi, Hajime
    IEEE TRANSACTIONS ON MAGNETICS, 2012, 48 (02) : 667 - 670
  • [9] Valuation of variable annuity portfolios using finite and infinite width neural networks
    Lim, Hong Beng
    Shyamalkumar, Nariankadu D.
    Tao, Siyang
    INSURANCE MATHEMATICS & ECONOMICS, 2025, 120 : 269 - 284
  • [10] Simulation of stratospheric ozone in global forecast model using linear photochemistry parameterization
    Jeong, Gill-Ran
    Monge-Sanz, Beatriz M.
    Lee, Eun-Hee
    Ziemke, Jerald R.
    ASIA-PACIFIC JOURNAL OF ATMOSPHERIC SCIENCES, 2016, 52 : 479 - 494