The general critical analysis for continuous-time UPPAM recurrent neural networks

Times Cited: 1
Authors
Qiao, Chen [1 ,2 ]
Jing, Wen-Feng [1 ]
Fang, Jian [1 ,2 ]
Wang, Yu-Ping [2 ,3 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Tulane Univ, Dept Biomed Engn, New Orleans, LA 70118 USA
[3] Tulane Univ, Ctr Genom & Bioinformat, New Orleans, LA 70112 USA
Funding
US National Science Foundation (NSF);
Keywords
Continuous-time recurrent neural network; Uniformly pseudo-projection-antimonotone network; General critical condition; Dynamical analysis; GLOBAL EXPONENTIAL STABILITY; OPTIMIZATION; CONVERGENCE;
DOI
10.1016/j.neucom.2015.09.103
Chinese Library Classification (CLC)
TP18 (Artificial Intelligence Theory);
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The uniformly pseudo-projection-anti-monotone (UPPAM) neural network model, which can be regarded as a unified model of continuous-time neural networks (CNNs), covers almost all known individual CNN models. Recently, the critical dynamic behaviors of CNNs have drawn special attention due to their importance in both theory and applications. In this paper, we analyze the UPPAM network under general critical conditions. It is shown that the UPPAM network possesses global convergence and asymptotic stability under general critical conditions provided that the network satisfies a quasi-symmetry requirement on the connection matrices, a condition that is easy to verify and apply. General critical dynamics have rarely been studied before, and this work is an attempt to obtain a meaningful guarantee of general critical convergence and stability for CNNs. Since the UPPAM network is a unified model for CNNs, the results obtained here generalize and extend the existing critical conclusions for individual CNN models, as well as those for non-critical cases. Moreover, the easily verified conditions for general critical convergence and stability can further promote the applications of CNNs. (C) 2015 Elsevier B.V. All rights reserved.
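The abstract's convergence claim can be illustrated numerically. The UPPAM model's exact equations are not given in this record, so the sketch below uses a classical Hopfield-type continuous-time RNN, dx/dt = -x + W·σ(x) + b, as a stand-in, with a symmetric connection matrix W (a simple special case of a symmetry-type condition on the connection matrix). The model form, dimension, and parameters are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Hedged sketch: Hopfield-type continuous-time RNN
#   dx/dt = -x + W @ sigma(x) + b
# with symmetric W, for which the dynamics are gradient-like and
# trajectories settle at an equilibrium (Cohen-Grossberg/Hopfield theory).

def sigma(x):
    # A typical bounded, monotone activation.
    return np.tanh(x)

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = 0.5 * (A + A.T)            # symmetrize the connection matrix
b = rng.standard_normal(n)

def simulate(x0, dt=1e-2, steps=50_000):
    """Forward-Euler integration of the network dynamics."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + W @ sigma(x) + b)
    return x

x_final = simulate(rng.standard_normal(n))

# At an equilibrium, dx/dt = 0, so the residual of the right-hand side
# should be small after a long integration horizon.
residual = np.linalg.norm(-x_final + W @ sigma(x_final) + b)
print(residual)
```

With a non-symmetric W the same simulation can oscillate or diverge, which is why conditions on the connection matrix, such as the quasi-symmetry requirement studied in the paper, matter for convergence guarantees.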
Pages: 40-46 (page count: 7)