Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks

Cited by: 12
Authors
Zhang, Huisheng [1 ]
Zhang, Ying [1 ]
Zhu, Shuai [1 ]
Xu, Dongpo [2 ]
Institutions
[1] Dalian Maritime Univ, Sch Sci, Dalian 116026, Peoples R China
[2] Northeast Normal Univ, Sch Math & Stat, Changchun 130024, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
Fully complex-valued neural networks; Mini-batch gradient algorithm; Convergence; Wirtinger calculus; BACKPROPAGATION ALGORITHM; PERFORMANCE BOUNDS; MOMENTUM; BOUNDEDNESS; ESTIMATORS; LMS;
DOI
10.1016/j.neucom.2020.04.114
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks. The mini-batch gradient method is widely used in neural network training; however, its convergence analysis is usually restricted to real-valued neural networks and is probabilistic in nature. By introducing a new Taylor mean value theorem for analytic functions, we establish deterministic convergence results for the fully complex mini-batch gradient algorithm under mild conditions. Deterministic convergence here means that the algorithm converges deterministically, and both weak convergence and strong convergence are proved. Benefiting from the newly introduced mean value theorem, our results are global in nature: they hold for arbitrarily given initial values of the weights. The theoretical findings are validated with a simulation example. (C) 2020 Elsevier B.V. All rights reserved.
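As a rough illustration of the kind of algorithm analyzed in the abstract, the sketch below trains a single fully complex-valued neuron (analytic tanh activation) by mini-batch steepest descent using the Wirtinger conjugate gradient dE/dw*. This is not the authors' code; the toy data, learning rate, batch size, and helper names are assumptions chosen only to make the example self-contained.

```python
# Minimal sketch (assumed setup, not the paper's implementation) of fully
# complex mini-batch gradient descent with Wirtinger calculus.
import numpy as np

rng = np.random.default_rng(0)

def act(z):
    # Fully complex (analytic) activation: tanh extended to the complex plane.
    return np.tanh(z)

def act_prime(z):
    # Derivative of tanh, also analytic.
    return 1.0 - np.tanh(z) ** 2

# Toy regression data: complex inputs X (n samples x p features), targets d.
n, p = 200, 3
X = rng.standard_normal((n, p)) + 1j * rng.standard_normal((n, p))
w_true = rng.standard_normal(p) + 1j * rng.standard_normal(p)
d = act(X @ w_true)

w = np.zeros(p, dtype=complex)   # arbitrary initial weights
eta, batch_size, epochs = 0.1, 20, 200

for epoch in range(epochs):
    perm = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        Xb, db = X[idx], d[idx]
        z = Xb @ w
        e = act(z) - db                      # per-sample complex error
        # For E = (1/2)|e|^2 with an analytic activation f, the Wirtinger
        # conjugate gradient is dE/dw* = (1/2) e * conj(f'(z)) * conj(x);
        # average it over the mini-batch.
        grad = (Xb.conj() * (e * np.conj(act_prime(z)))[:, None]).mean(axis=0) / 2
        w -= eta * grad                      # steepest-descent update in w

print("final mean squared error:", np.mean(np.abs(act(X @ w) - d) ** 2))
```

The update direction is taken with respect to the conjugate weights, which is the steepest-descent direction for a real-valued cost of complex variables under Wirtinger calculus; the same scheme extends to multilayer fully complex networks by backpropagating the conjugate cogradients layer by layer.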
Pages: 185 - 193
Number of pages: 9
Related Papers
50 records in total
  • [1] Deterministic Convergence of Wirtinger-Gradient Methods for Complex-Valued Neural Networks
    Xu, Dongpo
    Dong, Jian
    Zhang, Huisheng
    NEURAL PROCESSING LETTERS, 2017, 45 (02) : 445 - 456
  • [3] Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
    Zhang, Huisheng
    Zhang, Chao
    Wu, Wei
    DISCRETE DYNAMICS IN NATURE AND SOCIETY, 2009, 2009
  • [4] Convergence analysis of an augmented algorithm for fully complex-valued neural networks
    Xu, Dongpo
    Zhang, Huisheng
    Mandic, Danilo P.
    NEURAL NETWORKS, 2015, 69 : 44 - 50
  • [5] Convergence of an Online Split-Complex Gradient Algorithm for Complex-Valued Neural Networks
    Zhang, Huisheng
    Xu, Dongpo
    Wang, Zhiping
    DISCRETE DYNAMICS IN NATURE AND SOCIETY, 2010, 2010
  • [6] Adaptive orthogonal gradient descent algorithm for fully complex-valued neural networks
    Zhao, Weijing
    Huang, He
    NEUROCOMPUTING, 2023, 546
  • [7] A New Learning Algorithm for Fully Complex-Valued RBF Networks
    Liu, Shulin
    Huang, He
    2016 31ST YOUTH ACADEMIC ANNUAL CONFERENCE OF CHINESE ASSOCIATION OF AUTOMATION (YAC), 2016, : 469 - 474
  • [8] Complex-Valued Neural Network and Complex-Valued Backpropagation Learning Algorithm
    Nitta, Tohru
    ADVANCES IN IMAGING AND ELECTRON PHYSICS, VOL 152, 2008, 152 : 153 - 220
  • [9] Scaled Conjugate Gradient Learning for Complex-Valued Neural Networks
    Popa, Calin-Adrian
    MENDEL 2015: RECENT ADVANCES IN SOFT COMPUTING, 2015, 378 : 221 - 233
  • [10] DYNAMICS OF FULLY COMPLEX-VALUED NEURAL NETWORKS
    HIROSE, A
    ELECTRONICS LETTERS, 1992, 28 (16) : 1492 - 1494