Adaptive CL-BFGS Algorithms for Complex-Valued Neural Networks

Cited by: 4
Authors
Zhang, Yongliang [1 ,2 ]
Huang, He [1 ,2 ]
Shen, Gangxiang [1 ,2 ]
Affiliations
[1] Soochow Univ, Sch Elect & Informat Engn, Suzhou 215006, Peoples R China
[2] Jiangsu Engn Res Ctr Novel Opt Fiber Technol & Co, Suzhou 215006, Peoples R China
Keywords
Approximation algorithms; Training; Signal processing algorithms; Optimization; Neural networks; Upper bound; Mathematical models; Adaptive complex-valued limited-memory BFGS (ACL-BFGS) algorithm; complex-valued neural networks (CVNNs); moving average; multistep quasi-Newton method; variable memory size; CLASSIFICATION; CONVERGENCE;
DOI
10.1109/TNNLS.2021.3135553
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The complex-valued limited-memory BFGS (CL-BFGS) algorithm is efficient for the training of complex-valued neural networks (CVNNs). As an important parameter, the memory size represents the number of saved vector pairs and essentially affects the performance of the algorithm. However, determining a suitable memory size for the CL-BFGS algorithm remains challenging. To deal with this issue, an adaptive method is proposed in which the memory size is allowed to vary during the iteration process. Basically, at each iteration, with the help of the multistep quasi-Newton method, an appropriate memory size is chosen from a variable set {1, 2, ..., M} by approximating the complex Hessian matrix as closely as possible. To reduce the computational complexity and ensure the desired performance, the upper bound M is adjusted according to the moving average of the memory sizes found in previous iterations. The proposed adaptive CL-BFGS (ACL-BFGS) algorithm can be efficiently applied to the training of CVNNs. Moreover, it is suggested to take multiple memory sizes to construct the search direction, which further improves the performance of the ACL-BFGS algorithm. Experimental results on some benchmark problems, including pattern classification, complex function approximation, and nonlinear channel equalization, are given to illustrate the advantages of the developed algorithms over some previous ones.
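The paper's specific multistep selection criterion and moving-average rule for M are not reproduced in the abstract. As a minimal sketch of the machinery the memory size controls, the following shows the standard L-BFGS two-loop recursion with a variable memory size m; for complex vectors, inner products are taken as the real part of the Hermitian product, an assumption consistent with common CR-calculus treatments of CVNN training (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def two_loop_direction(grad, s_list, y_list, m):
    """L-BFGS two-loop recursion using the last m stored (s, y) pairs.

    grad    : current (possibly complex) gradient vector
    s_list  : stored parameter differences s_k = x_{k+1} - x_k
    y_list  : stored gradient differences  y_k = g_{k+1} - g_k
    m       : memory size, 1 <= m <= len(s_list)
    Returns a descent direction -H_k * grad, where H_k is the
    implicit inverse-Hessian approximation built from the m pairs.
    """
    def dot(a, b):
        # Real part of the Hermitian inner product; reduces to the
        # ordinary dot product for real-valued vectors.
        return np.real(np.vdot(a, b))

    q = grad.astype(complex).copy()
    s_used, y_used = s_list[-m:], y_list[-m:]

    # First loop: newest pair to oldest.
    alphas = []
    for s, y in zip(reversed(s_used), reversed(y_used)):
        rho = 1.0 / dot(y, s)
        a = rho * dot(s, q)
        alphas.append(a)
        q = q - a * y

    # Scale the initial inverse-Hessian guess by gamma * I.
    s, y = s_used[-1], y_used[-1]
    gamma = dot(s, y) / dot(y, y)
    r = gamma * q

    # Second loop: oldest pair to newest, reusing the stored alphas.
    for (s, y), a in zip(zip(s_used, y_used), reversed(alphas)):
        rho = 1.0 / dot(y, s)
        beta = rho * dot(y, r)
        r = r + (a - beta) * s

    return -r  # search direction
```

An adaptive scheme in the spirit of the abstract would call this routine for each candidate m in {1, ..., M}, score how well the resulting implicit Hessian approximation matches recent curvature information, and keep the best m; the cost per iteration stays O(m * n), which is why bounding M matters.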
Pages: 6313-6327
Page count: 15