Kurtosis-Based CRTRL Algorithms for Fully Connected Recurrent Neural Networks

Cited by: 16
Authors
Menguc, Engin Cemal [1 ]
Acir, Nurettin [2 ]
Affiliations
[1] Nigde Omer Halisdemir Univ, Elect & Elect Engn Dept, TR-51245 Nigde, Turkey
[2] Bursa Tech Univ, Elect & Elect Engn Dept, TR-1619 Bursa, Turkey
Keywords
Augmented statistics; circular and noncircular (NC) complex-valued signals; kurtosis; nonlinear complex-valued adaptive filter; LEAST-MEAN KURTOSIS; STOCHASTIC-ANALYSIS; COMPLEX
DOI
10.1109/TNNLS.2018.2826442
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, kurtosis-based complex-valued real-time recurrent learning (KCRTRL) and kurtosis-based augmented CRTRL (KACRTRL) algorithms are proposed for training fully connected recurrent neural networks (FCRNNs) in the complex domain. These algorithms are designed by minimizing cost functions based on the kurtosis of a complex-valued error signal. The KCRTRL algorithm exploits the circularity properties of complex-valued signals; it not only provides a faster convergence rate but also yields a lower steady-state error. However, KCRTRL is suboptimal for processing noncircular (NC) complex-valued signals. The KACRTRL algorithm, on the other hand, captures complete second-order information through augmented statistics and thus considerably improves the performance of the FCRNN on NC complex-valued signals. Simulation results on one-step-ahead prediction problems show that the proposed KCRTRL algorithm significantly enhances performance for circular complex-valued signals only, whereas the proposed KACRTRL algorithm outperforms existing algorithms for NC complex-valued signals in terms of both convergence rate and steady-state error.
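The abstract rests on two quantities: the kurtosis of a complex-valued error signal (the basis of the KCRTRL/KACRTRL cost functions) and the pseudocovariance, whose vanishing distinguishes circular from noncircular signals and motivates the augmented statistics in KACRTRL. The sketch below illustrates these standard definitions on synthetic data; it is not the authors' algorithm, and the signal models and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Circular complex Gaussian: i.i.d. real and imaginary parts, unit power.
circ = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Noncircular (improper) signal: correlated real and imaginary parts.
x = rng.standard_normal(n)
noncirc = x + 1j * (0.3 * x + 0.1 * rng.standard_normal(n))

def covariance(e):
    """Conventional covariance E[e e*] (the signal power)."""
    return np.mean(e * np.conj(e)).real

def pseudocovariance(e):
    """Pseudocovariance E[e e]; approximately zero iff e is
    second-order circular."""
    return np.mean(e * e)

def complex_kurtosis(e):
    """Standard kurtosis of a zero-mean complex random variable:
    E[|e|^4] - 2 (E[|e|^2])^2 - |E[e^2]|^2, which is zero for a
    complex Gaussian signal."""
    c = covariance(e)
    p = pseudocovariance(e)
    return np.mean(np.abs(e) ** 4) - 2 * c ** 2 - abs(p) ** 2

print(abs(pseudocovariance(circ)))     # near 0: circular signal
print(abs(pseudocovariance(noncirc)))  # clearly nonzero: noncircular signal
print(complex_kurtosis(circ))          # near 0 for a complex Gaussian
```

The augmented approach in KACRTRL processes the signal jointly with its conjugate precisely so that the pseudocovariance information visible above is not discarded, which is why it helps for NC signals and changes little for circular ones.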
Pages: 6123-6131
Page count: 9
Related Papers
50 items total
  • [41] Estimation of a regression function on a manifold by fully connected deep neural networks
    Kohler, Michael
    Langer, Sophie
    Reif, Ulrich
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2023, 222 : 160 - 181
  • [42] Modeling Dynamic Hysteresis through Fully Connected Cascade Neural Networks
    Laudani, Antonino
    Lozito, Gabriele Maria
    Fulginei, Francesco Riganti
    Salvini, Alessandro
    2016 IEEE 2ND INTERNATIONAL FORUM ON RESEARCH AND TECHNOLOGIES FOR SOCIETY AND INDUSTRY LEVERAGING A BETTER TOMORROW (RTSI), 2016, : 387 - 391
  • [43] Generalization in fully-connected neural networks for time series forecasting
    Borovykh, Anastasia
    Oosterlee, Cornelis W.
    Bohte, Sander M.
    JOURNAL OF COMPUTATIONAL SCIENCE, 2019, 36
  • [44] AirFC: Designing Fully Connected Layers for Neural Networks with Wireless Signals
    Reus-Muns, Guillem
    Alemdar, Kubra
    Sanchez, Sara Garcia
    Roy, Debashri
    Chowdhury, Kaushik R.
    PROCEEDINGS OF THE 2023 INTERNATIONAL SYMPOSIUM ON THEORY, ALGORITHMIC FOUNDATIONS, AND PROTOCOL DESIGN FOR MOBILE NETWORKS AND MOBILE COMPUTING, MOBIHOC 2023, 2023, : 71 - 80
  • [45] Thermodynamic properties of fully connected Q-Ising neural networks
    Bollé, D.
    Rieger, H.
    Shim, G. M.
    JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 1994, 27 (10): 3411 - 3426
  • [46] A novel structured sparse fully connected layer in convolutional neural networks
    Matsumura, Naoki
    Ito, Yasuaki
    Nakano, Koji
    Kasagi, Akihiko
    Tabaru, Tsuguchika
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (11)
  • [47] Parallel dynamics of fully connected Q-Ising neural networks
    Bolle, D
    Jongen, G
    Shim, GM
    JOURNAL OF STATISTICAL PHYSICS, 1998, 91 (1-2) : 125 - 153
  • [48] Thermodynamics of fully connected Blume-Emery-Griffiths neural networks
    Bollé, D
    Verbeiren, T
    JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 2003, 36 (02): 295 - 305
  • [50] Depth Degeneracy in Neural Networks: Vanishing Angles in Fully Connected ReLU Networks on Initialization
    Jakub, Cameron
    Nica, Mihai
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 45