The general critical analysis for continuous-time UPPAM recurrent neural networks

Cited by: 1
Authors
Qiao, Chen [1 ,2 ]
Jing, Wen-Feng [1 ]
Fang, Jian [1 ,2 ]
Wang, Yu-Ping [2 ,3 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Tulane Univ, Dept Biomed Engn, New Orleans, LA 70118 USA
[3] Tulane Univ, Ctr Genom & Bioinformat, New Orleans, LA 70112 USA
Funding
National Science Foundation (USA);
Keywords
Continuous-time recurrent neural network; Uniformly pseudo-projection-antimonotone network; General critical condition; Dynamical analysis; GLOBAL EXPONENTIAL STABILITY; OPTIMIZATION; CONVERGENCE;
DOI
10.1016/j.neucom.2015.09.103
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The uniformly pseudo-projection-anti-monotone (UPPAM) neural network model, which can be regarded as a unified model of continuous-time neural networks (CNNs), covers almost all known individual CNN models. Recently, the critical dynamic behaviors of CNNs have drawn special attention because of their importance in both theory and applications. In this paper, we analyze the UPPAM network under general critical conditions. It is shown that the UPPAM network is globally convergent and asymptotically stable under the general critical conditions whenever the network satisfies a quasi-symmetry requirement on the connection matrices, a requirement that is easy to verify and apply. General critical dynamics have rarely been studied before, and this work is an attempt to obtain a meaningful guarantee of general critical convergence and stability for CNNs. Since the UPPAM network is a unified model for CNNs, the results obtained here generalize and extend the existing critical conclusions for individual CNN models, as well as the non-critical cases. Moreover, the easily verified conditions for general critical convergence and stability can further promote the applications of CNNs. (C) 2015 Elsevier B.V. All rights reserved.
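As a reading aid (not taken from the paper): the kind of continuous-time recurrent network and global-convergence behavior the abstract discusses can be sketched numerically. The dynamics, weight scaling, and tolerances below are illustrative assumptions; the paper's quasi-symmetry condition on the connection matrices is approximated here by a plainly symmetric, small-gain weight matrix, a classical setting in which convergence to a unique equilibrium is well known.

```python
import numpy as np

def simulate(W, b, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of dx/dt = -x + W @ tanh(x) + b."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (-x + W @ np.tanh(x) + b)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = (A + A.T) / 2                       # symmetric part
W = 0.4 * S / np.linalg.norm(S, 2)      # spectral norm 0.4 < 1: contraction
b = rng.standard_normal(3)

# Trajectories from two very different initial states approach the same
# equilibrium, illustrating global convergence in this regime.
x_a = simulate(W, b, rng.standard_normal(3))
x_b = simulate(W, b, 10 * rng.standard_normal(3))
assert np.allclose(x_a, x_b, atol=1e-6)
```

Because tanh is 1-Lipschitz, keeping the spectral norm of W below 1 makes the vector field a contraction, so every trajectory converges to the same fixed point; the paper's contribution concerns the much harder critical case, where such strict-contraction margins are absent.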
Pages: 40-46 (7 pages)
Related papers
(50 records)
  • [31] Causal Navigation by Continuous-time Neural Networks
    Vorbach, Charles
    Hasani, Ramin
    Amini, Alexander
    Lechner, Mathias
    Rus, Daniela
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] On unique representations of certain dynamical systems produced by continuous-time recurrent neural networks
    Kimura, M
    NEURAL COMPUTATION, 2002, 14 (12) : 2981 - 2996
  • [33] Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks
    Hu, SQ
    Wang, J
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2002, 47 (05) : 802 - 807
  • [34] Codimension-2 parameter space structure of continuous-time recurrent neural networks
    Randall D. Beer
    Biological Cybernetics, 2022, 116 : 501 - 515
  • [35] ON-LINE CONTINUOUS-TIME MUSIC MOOD REGRESSION WITH DEEP RECURRENT NEURAL NETWORKS
    Weninger, Felix
    Eyben, Florian
    Schuller, Bjoern
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,
  • [37] Convergence Analysis of Continuous-Time Systems Based on Feedforward Neural Networks
    Huang, Yuzhu
    Liu, Derong
    Wei, Qinglai
    2013 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2013, : 2095 - 2098
  • [38] Non-Euclidean Contraction Analysis of Continuous-Time Neural Networks
    Davydov, Alexander
    Proskurnikov, Anton V.
    Bullo, Francesco
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2025, 70 (01) : 235 - 250
  • [39] Novel delay-distribution-dependent stability analysis for continuous-time recurrent neural networks with stochastic delay
    Wang Shen-Quan
    Feng Jian
    Zhao Qing
    CHINESE PHYSICS B, 2012, 21 (12) : 161 - 167