Modeling and qualitative analysis of continuous-time neural networks under pure structural variations

Cited by: 2
Authors
Grujic, LT [1]
Michel, AN [1 ]
Institution
[1] University of Notre Dame, Department of Electrical Engineering, Notre Dame, IN 46556, USA
Funding
National Science Foundation (USA)
Keywords
Hopfield neural networks [1-4] have mainly been modeled by assuming a time-invariant structure and used to study their qualitative dynamical properties [5,6]. During the learning process a neural network is subjected to (arbitrary or designed) structural variation [4,7]. The dynamics of neural networks under designed structural variations was investigated in [8], and under unpredictable structural and/or parameter variations in [10]. However, the modeling of Hopfield neural networks under pure structural variations has not been carried out; this is done in what follows. The stability analysis initiated by Hopfield [1] for a special class of neural networks was developed for any type of Hopfield neural network and extended to the qualitative analysis of their motions in forced regimes in [5,11]. It was emphasized in [1,9] that in neurobiology the structure of the circuit changes every time learning occurs, and that a memory stored in an artificial neural network is achieved by an appropriate choice of conductances. Stability analysis of neural networks under structural variations was considered in [5,7], and under both parameter and structural variations in [8]. In what follows, new results are obtained for exponential stability of x = 0 of Hopfield neural networks, and for estimates of the domain of exponential stability (also defined herein), under arbitrary structural variations. (Supported in part by NSF Grant ECS 91-07728.)
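For context, a minimal sketch of the standard continuous-time Hopfield model addressed by this line of analysis; the symbols b_i, T_ij, s_j and u_i follow the usual convention and are assumed here rather than taken from this paper:

\dot{x}_i(t) = -b_i x_i(t) + \sum_{j=1}^{n} T_{ij}\, s_j\bigl(x_j(t)\bigr) + u_i, \qquad i = 1, \dots, n,

where x_i is the state of the i-th neuron, b_i > 0 a self-decay coefficient, T_ij the interconnection conductances whose changes constitute the structural variations, s_j a sigmoidal activation with s_j(0) = 0, and u_i an external input; with u = 0 the network operates in its free regime and x = 0 is an equilibrium.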
DOI
10.1016/0378-4754(95)00004-6
CLC Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
A qualitative analysis is developed for continuous-time neural networks subjected to random pure structural variations. Simple algebraic conditions are established both for structural exponential stability of x = 0 of the neural network and for estimates of its domain of attraction. Bounds on the motions of the neural network in a forced regime are also provided. None of these results requires any information about the actual structure of the network, which can be completely unknown and may vary unpredictably.
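As a point of reference, structural exponential stability of x = 0 can be expressed by a bound of the following standard form (a sketch of the general definition only; the constants α, β and the set D are assumed notation, not the paper's specific results):

\| x(t; t_0, x_0) \| \le \alpha\, \| x_0 \|\, e^{-\beta (t - t_0)}, \qquad \forall t \ge t_0,\ \forall x_0 \in D,

required to hold for every admissible structural variation, with α ≥ 1 and β > 0 independent of the particular (possibly unknown and time-varying) structure; D then serves as an estimate of the domain of exponential stability.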
Pages: 523-533
Page count: 11
Related papers
50 records in total
  • [21] Hasani, Ramin; Lechner, Mathias; Amini, Alexander; Liebenwein, Lucas; Ray, Aaron; Tschaikowski, Max; Teschl, Gerald; Rus, Daniela. Closed-form continuous-time neural networks. Nature Machine Intelligence, 2022, 4: 992-1003
  • [22] Sontag, Eduardo D. A learning result for continuous-time recurrent neural networks. Systems and Control Letters, 1998, 34 (03): 151-158
  • [23] Kree, R.; Zippelius, A. Continuous-time dynamics of asymmetrically diluted neural networks. Physical Review A, 1987, 36 (09): 4421-4427
  • [24] Wang, X.; Blum, E. K. Discrete-time versus continuous-time models of neural networks. Journal of Computer and System Sciences, 1992, 45 (01): 1-19
  • [25] Develer, Umit; Akar, Mehmet. Analysis of cluster consensus in continuous-time networks. 2018 Annual American Control Conference (ACC), 2018: 448-453
  • [26] Liao, X. X.; Wang, J. Global dissipativity of continuous-time recurrent neural networks with time delay. Physical Review E, 2003, 68 (01)
  • [27] Pakdaman, K.; Grotta-Ragazzo, C.; Malta, C. P. Transient regime duration in continuous-time neural networks with delay. Physical Review E, 1998, 58 (03): 3623-3627
  • [29] Weigand, Jonas; Beintema, Gerben I.; Ulmen, Jonas; Goerges, Daniel; Toth, Roland; Schoukens, Maarten; Ruskowski, Martin. State derivative normalization for continuous-time deep neural networks. IFAC PapersOnLine, 2024, 58 (15): 253-258
  • [30] Mavkov, Bojan; Forgione, Marco; Piga, Dario. Integrated neural networks for nonlinear continuous-time system identification. IEEE Control Systems Letters, 2020, 4 (04): 851-856