Adaptive CL-BFGS Algorithms for Complex-Valued Neural Networks

Cited by: 4
Authors
Zhang, Yongliang [1,2]
Huang, He [1,2]
Shen, Gangxiang [1,2]
Affiliations
[1] Soochow Univ, Sch Elect & Informat Engn, Suzhou 215006, Peoples R China
[2] Jiangsu Engn Res Ctr Novel Opt Fiber Technol & Co, Suzhou 215006, Peoples R China
Keywords
Approximation algorithms; Training; Signal processing algorithms; Optimization; Neural networks; Upper bound; Mathematical models; Adaptive complex-valued limited-memory BFGS (ACL-BFGS) algorithm; complex-valued neural networks (CVNNs); moving average; multistep quasi-Newton method; variable memory size; CLASSIFICATION; CONVERGENCE;
DOI
10.1109/TNNLS.2021.3135553
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The complex-valued limited-memory BFGS (CL-BFGS) algorithm is efficient for training complex-valued neural networks (CVNNs). As an important parameter, the memory size represents the number of saved vector pairs and essentially determines the performance of the algorithm. However, determining a suitable memory size for the CL-BFGS algorithm remains challenging. To deal with this issue, an adaptive method is proposed in which the memory size is allowed to vary during the iteration process. At each iteration, with the help of the multistep quasi-Newton method, an appropriate memory size is chosen from a variable set {1, 2, ..., M} so that the complex Hessian matrix is approximated as closely as possible. To reduce the computational complexity and ensure the desired performance, the upper bound M is adjusted according to the moving average of the memory sizes selected in previous iterations. The proposed adaptive CL-BFGS (ACL-BFGS) algorithm can be efficiently applied to the training of CVNNs. Moreover, it is suggested to take multiple memory sizes to construct the search direction, which further improves the performance of the ACL-BFGS algorithm. Experimental results on benchmark problems, including pattern classification, complex function approximation, and nonlinear channel equalization, illustrate the advantages of the developed algorithms over previous ones.
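For intuition only, the following is a minimal real-valued sketch of the adaptive memory-size idea summarized in the abstract, built on the standard L-BFGS two-loop recursion. It is not the paper's CL-BFGS implementation: the actual algorithm operates on complex-valued parameters via Wirtinger calculus and selects the memory size with a multistep quasi-Newton criterion, whereas the selection rule, the smoothing factor beta, and the cap-update rule below are simplified, hypothetical placeholders.

import numpy as np

def apply_inverse_hessian(v, s_list, y_list, m):
    # L-BFGS two-loop recursion: apply the implicit inverse-Hessian
    # approximation built from the most recent m curvature pairs to v.
    s_used, y_used = s_list[-m:], y_list[-m:]
    q = np.array(v, dtype=float)
    alphas, rhos = [], []
    for s, y in zip(reversed(s_used), reversed(y_used)):      # newest -> oldest
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q = q - alpha * y
        alphas.append(alpha)
        rhos.append(rho)
    gamma = np.dot(s_used[-1], y_used[-1]) / np.dot(y_used[-1], y_used[-1])
    r = gamma * q
    for (s, y), alpha, rho in zip(zip(s_used, y_used),        # oldest -> newest
                                  reversed(alphas), reversed(rhos)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return r

def choose_memory_size(s_list, y_list, M):
    # Pick m in {1, ..., M} whose implicit approximation best satisfies the
    # inverse secant condition H_m @ y_k ~= s_k.  This is a simplified
    # stand-in for the paper's multistep quasi-Newton selection criterion.
    best_m, best_err = 1, np.inf
    for m in range(1, min(M, len(s_list)) + 1):
        err = np.linalg.norm(
            apply_inverse_hessian(y_list[-1], s_list, y_list, m) - s_list[-1])
        if err < best_err:
            best_m, best_err = m, err
    return best_m

def adaptive_lbfgs_step(grad, s_list, y_list, M, m_avg, beta=0.9):
    # One adaptive step: select a memory size, build the search direction,
    # and update the moving average that caps the upper bound for the next
    # iteration (beta and the cap rule are hypothetical choices).
    if not s_list:
        return -np.asarray(grad, dtype=float), m_avg, M
    m = choose_memory_size(s_list, y_list, M)
    direction = -apply_inverse_hessian(grad, s_list, y_list, m)
    m_avg = beta * m_avg + (1.0 - beta) * m
    M_next = max(1, int(np.ceil(m_avg)) + 1)
    return direction, m_avg, M_next

A full trainer would wrap this step in a loop that maintains s_k = w_{k+1} - w_k and y_k = grad_{k+1} - grad_k (Wirtinger gradients in the complex case) and applies a line search along the returned direction; those details are omitted here.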
Pages: 6313-6327
Page count: 15