The general critical analysis for continuous-time UPPAM recurrent neural networks

Cited by: 1
Authors
Qiao, Chen [1 ,2 ]
Jing, Wen-Feng [1 ]
Fang, Jian [1 ,2 ]
Wang, Yu-Ping [2 ,3 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Tulane Univ, Dept Biomed Engn, New Orleans, LA 70118 USA
[3] Tulane Univ, Ctr Genom & Bioinformat, New Orleans, LA 70112 USA
Funding
US National Science Foundation;
Keywords
Continuous-time recurrent neural network; Uniformly pseudo-projection-antimonotone network; General critical condition; Dynamical analysis; GLOBAL EXPONENTIAL STABILITY; OPTIMIZATION; CONVERGENCE;
DOI
10.1016/j.neucom.2015.09.103
CLC number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The uniformly pseudo-projection-anti-monotone (UPPAM) neural network model, which can be regarded as a unified model of continuous-time neural networks (CNNs), covers almost all known individual CNN models. Recently, studies on the critical dynamic behaviors of CNNs have drawn special attention because of their importance in both theory and applications. In this paper, we present an analysis of the UPPAM network under general critical conditions. It is shown that the UPPAM network possesses global convergence and asymptotic stability under general critical conditions, provided that the network satisfies a quasi-symmetry requirement on the connection matrices, which is easy to verify and apply. General critical dynamics have rarely been studied before, and this work is an attempt to obtain meaningful guarantees of general critical convergence and stability for CNNs. Since the UPPAM network is a unified model for CNNs, the results obtained here generalize and extend the existing critical conclusions for individual CNN models, not to mention the non-critical cases. Moreover, the easily verified conditions for general critical convergence and stability can further promote the applications of CNNs. (C) 2015 Elsevier B.V. All rights reserved.
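The abstract's claim concerns convergence of continuous-time recurrent networks under symmetry-type conditions on the connection matrix. The paper's UPPAM formulation and quasi-symmetry condition are not reproduced in this record, so the following is only a minimal illustrative sketch: a generic continuous-time recurrent network dx/dt = -x + W g(x) + b with a symmetric W (the classical Hopfield-type setting, for which convergence to equilibrium is well known), integrated by forward Euler. All names and parameter values here are assumptions for illustration, not the paper's method.

```python
import numpy as np

# Illustrative sketch (not the paper's UPPAM model): a continuous-time
# recurrent network  dx/dt = -x + W g(x) + b  with a symmetric connection
# matrix W and bounded monotone activation g. For symmetric W this system
# admits a Lyapunov function, so trajectories converge to an equilibrium.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = 0.5 * (A + A.T)          # symmetrized connection matrix (assumption)
b = rng.standard_normal(n)   # constant input
g = np.tanh                  # bounded, monotonically increasing activation

x = rng.standard_normal(n)   # random initial state
dt = 0.01                    # Euler step size
for _ in range(20000):       # integrate up to t = 200
    x = x + dt * (-x + W @ g(x) + b)

# Near an equilibrium the vector field vanishes, so the residual is ~0.
residual = np.linalg.norm(-x + W @ g(x) + b)
print(residual)
```

Running the sketch, the residual shrinks toward zero, which is the convergence-to-equilibrium behavior the abstract's critical-condition analysis guarantees for the much broader UPPAM class.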
Pages: 40 - 46
Page count: 7
Related papers
50 records in total
  • [1] Stability Analysis of a General Class of Continuous-Time Recurrent Neural Networks
    Fu, Chaojin
    Wang, Zhongsheng
    ADVANCES IN NEURAL NETWORKS - ISNN 2009, PT 1, PROCEEDINGS, 2009, 5551 : 340 - +
  • [2] Output convergence analysis of continuous-time recurrent neural networks
    Liu, DR
    Hu, SQ
    PROCEEDINGS OF THE 2003 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL III: GENERAL & NONLINEAR CIRCUITS AND SYSTEMS, 2003, : 466 - 469
  • [3] The UPPAM continuous-time RNN model and its critical dynamics study
    Qiao, Chen
    Jing, Wenfeng
    Xu, Zongben
    NEUROCOMPUTING, 2013, 106 : 158 - 166
  • [4] Search space analysis of recurrent spiking and continuous-time neural networks
    Ventresca, Mario
    Ombuki, Beatrice
    2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10, 2006, : 4514 - +
  • [5] A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks
    Zhang, Huaguang
    Wang, Zhanshan
    Liu, Derong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (07) : 1229 - 1262
  • [6] ON THE DYNAMICS OF SMALL CONTINUOUS-TIME RECURRENT NEURAL NETWORKS
    BEER, RD
    ADAPTIVE BEHAVIOR, 1995, 3 (04) : 469 - 509
  • [7] A learning result for continuous-time recurrent neural networks
    Sontag, ED
    SYSTEMS & CONTROL LETTERS, 1998, 34 (03) : 151 - 158
  • [8] Noisy recurrent neural networks: The continuous-time case
    Das, S
    Olurotimi, O
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1998, 9 (05): : 913 - 936
  • [9] Complete controllability of continuous-time recurrent neural networks
    Sontag, E
    Sussmann, H
    SYSTEMS & CONTROL LETTERS, 1997, 30 (04) : 177 - 183