Continuous limits of residual neural networks in case of large input data

Cited: 0
Authors
Herty, Michael [1 ]
Thuenen, Anna [2 ]
Trimborn, Torsten [3 ]
Visconti, Giuseppe [4 ]
Affiliations
[1] Rhein Westfal TH Aachen, Inst Geometrie & Prakt Math IGPM, Templergraben 55, D-52062 Aachen, Germany
[2] Tech Univ Clausthal, Inst Math, Erzstr 1, D-38678 Clausthal Zellerfeld, Germany
[3] NRW BANK, Kavalleriestr 22, D-40213 Dusseldorf, Germany
[4] Sapienza Univ Rome, Dept Math G Castelnuovo, Ple Aldo Moro 5, I-00185 Rome, Italy
Keywords
Neural networks; mean-field limit; well-posedness; optimal control; controllability; learning framework; convergence
DOI
10.2478/caim-2022-0008
Chinese Library Classification
O1 [Mathematics];
Discipline code
0701; 070101;
Abstract
Residual deep neural networks (ResNets) are mathematically described as interacting particle systems. In the limit of infinitely many layers, the ResNet leads to a coupled system of ordinary differential equations known as neural differential equations. For large-scale input data we derive a mean-field limit and show well-posedness of the resulting description. Further, we analyze the existence of solutions to the training process from both a controllability and an optimal control point of view. Numerical investigations based on the solution of a formal optimality system illustrate the theoretical findings.
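The layer-to-ODE correspondence underlying the abstract can be sketched in a few lines: a residual update x_{k+1} = x_k + h·σ(W_k x_k + b_k) is a forward-Euler step of the neural ODE dx/dt = σ(W(t)x + b(t)), so letting the number of layers grow with step size h = T/L approaches the continuous limit. This is a minimal illustration of that generic construction, not the authors' code; the function name, dimensions, and tanh activation are assumptions for the sketch.

```python
import numpy as np

def resnet_forward(x0, weights, biases, h):
    """Residual updates x_{k+1} = x_k + h * tanh(W_k x_k + b_k),
    i.e. forward-Euler steps of the neural ODE dx/dt = tanh(W(t) x + b(t))."""
    x = x0
    for W, b in zip(weights, biases):
        x = x + h * np.tanh(W @ x + b)
    return x

# As the layer count L grows with h = T / L, the discrete layer index
# plays the role of continuous "time" t in [0, T].
rng = np.random.default_rng(0)
d, L, T = 3, 100, 1.0
Ws = [0.1 * rng.standard_normal((d, d)) for _ in range(L)]
bs = [np.zeros(d) for _ in range(L)]
x_out = resnet_forward(np.ones(d), Ws, bs, h=T / L)
```

Refining L while keeping T fixed leaves the output nearly unchanged, which is the numerical signature of the continuous limit the paper analyzes.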
Pages: 96-120
Number of pages: 25
Related papers
50 records in total
  • [11] Continuous attractors of a class of neural networks with a large number of neurons
    Xu, Fang
    Yi, Zhang
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2011, 62 (10) : 3785 - 3795
  • [12] Verifying Binary Neural Networks on Continuous Input Space using Star Reachability
    Ivashchenko, Mykhailo
    Choi, Sung Woo
    Nguyen, Luan Viet
    Tran, Hoang-Dung
    2023 IEEE/ACM 11TH INTERNATIONAL CONFERENCE ON FORMAL METHODS IN SOFTWARE ENGINEERING, FORMALISE, 2023, : 7 - 17
  • [13] Poincare mapping of continuous recurrent neural networks excited by temporal external input
    Sato, S
    Gohara, K
    INTERNATIONAL JOURNAL OF BIFURCATION AND CHAOS, 2000, 10 (07): : 1677 - 1695
  • [14] Windowed electroencephalographic signal classifier based on continuous neural networks with delays in the input
    Alfaro-Ponce, M.
    Arguelles, A.
    Chairez, I.
    EXPERT SYSTEMS WITH APPLICATIONS, 2017, 68 : 1 - 10
  • [15] Input-output-to-state Stabilization for Continuous-time Neural Networks
    Ahn, Choon Ki
    INFORMATION, COMMUNICATION AND EDUCATION APPLICATION, VOL 11, 2013, 11 : 127 - 130
  • [16] Applying Multiple Neural Networks on Large Scale Data
    Boonkiatpong, Kritsanatt
    Sinthupinyo, Sukree
    INFORMATION AND ELECTRONICS ENGINEERING, 2011, 6 : 189 - 193
  • [17] Data-Driven Certification of Neural Networks With Random Input Noise
    Anderson, Brendon G.
    Sojoudi, Somayeh
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2023, 10 (01): : 249 - 260
  • [18] Noise signal as input data in self-organized neural networks
    Kagalovsky, V.
    Nemirovsky, D.
    Kravchenko, S. V.
    LOW TEMPERATURE PHYSICS, 2022, 48 (06) : 452 - 458
  • [19] Artificial neural networks in classification of NIR spectral data: Selection of the input
    Wu, W
    Massart, DL
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 1996, 35 (01) : 127 - 135
  • [20] Refining Constructive Neural Networks Using Functionally Expanded Input Data
    Bertini Junior, Joao Roberto
    Nicoletti, Maria do Carmo
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,