The general critical analysis for continuous-time UPPAM recurrent neural networks

Cited by: 1
Authors
Qiao, Chen [1 ,2 ]
Jing, Wen-Feng [1 ]
Fang, Jian [1 ,2 ]
Wang, Yu-Ping [2 ,3 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Tulane Univ, Dept Biomed Engn, New Orleans, LA 70118 USA
[3] Tulane Univ, Ctr Genom & Bioinformat, New Orleans, LA 70112 USA
Funding
US National Science Foundation;
Keywords
Continuous-time recurrent neural network; Uniformly pseudo-projection-antimonotone network; General critical condition; Dynamical analysis; GLOBAL EXPONENTIAL STABILITY; OPTIMIZATION; CONVERGENCE;
DOI
10.1016/j.neucom.2015.09.103
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The uniformly pseudo-projection-anti-monotone (UPPAM) neural network model can be regarded as a unified model of continuous-time neural networks (CNNs): it covers almost all known individual CNN models. Recently, the critical dynamic behaviors of CNNs have attracted particular attention because of their importance in both theory and applications. In this paper, we analyze the UPPAM network under general critical conditions. It is shown that the UPPAM network is globally convergent and asymptotically stable under the general critical conditions provided that the network satisfies a quasi-symmetry requirement on its connective matrices, a condition that is easy to verify and apply. General critical dynamics have rarely been studied before, and this work is an attempt to obtain meaningful guarantees of general critical convergence and stability for CNNs. Since the UPPAM network is a unified model of CNNs, the results obtained here generalize and extend the existing critical results for individual CNN models, as well as the non-critical cases. Moreover, the easily verified conditions for general critical convergence and stability can further promote the applications of CNNs. (C) 2015 Elsevier B.V. All rights reserved.
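As a rough illustration of the kind of dynamics the paper addresses, the minimal Python sketch below simulates a classical Hopfield-type continuous-time recurrent network x'(t) = -D x(t) + W g(x(t)) + b, one of the individual CNN models the UPPAM framework unifies. The symmetric choice of W, the tanh activation, and all numerical values are illustrative assumptions, not the paper's UPPAM formulation or its quasi-symmetry condition; the run simply shows a trajectory settling to an equilibrium, i.e. the convergence behavior whose critical-case guarantees the paper establishes.

import numpy as np

# Illustrative assumption: a generic Hopfield-type continuous-time RNN,
# not the paper's UPPAM model or its exact quasi-symmetry condition.
rng = np.random.default_rng(0)
n = 5
D = np.eye(n)                     # positive diagonal self-decay rates
A = rng.normal(size=(n, n))
W = 0.3 * (A + A.T) / 2.0         # symmetric connective matrix (stand-in for a
                                  # symmetry-type requirement on the weights)
b = rng.normal(size=n)            # constant external input
g = np.tanh                       # bounded, monotone activation

x = rng.normal(size=n)            # initial state
dt, steps = 1e-2, 20000
for _ in range(steps):
    x = x + dt * (-D @ x + W @ g(x) + b)   # forward Euler step of the ODE

# A near-zero residual indicates the trajectory has converged to an equilibrium.
print("equilibrium residual:", np.linalg.norm(-D @ x + W @ g(x) + b))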
Pages: 40-46 (7 pages)