Multi-output incremental back-propagation

Cited by: 0
Authors
Rachana Chaudhari
Dhwani Agarwal
Kritika Ravishankar
Nikita Masand
Vijay K. Sambhe
Sandeep S. Udmale
Institutions
[1] Veermata Jijabai Technological Institute (VJTI), Department of Computer Engineering and Information Technology
Keywords
Backpropagation; Hybrid layer; Optimal architecture; Weight initialization;
DOI: not available
Abstract
Deep learning techniques can build generalized models for problems that traditional approaches cannot solve, which explains the omnipresence of deep learning models across domains. However, considerable time is spent searching for the optimal hyperparameters that help a model generalize and reach the highest accuracy. This paper investigates a proposed model incorporating hybrid layers and a novel weight-initialization approach aimed at (1) reducing the overall trial-and-error time spent finding the optimal number of layers by providing the necessary insights, and (2) reducing the randomness in weight initialization through a novel incremental-backpropagation-based model architecture. The model, together with principal component analysis (PCA)-based initialization, provides a substantially more stable weight initialization, thereby improving train and test performance and speeding up convergence to an optimal solution. Furthermore, the proposed approach was evaluated on three data sets, where it outperformed state-of-the-art initialization methods.
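The abstract does not detail the PCA-based initialization scheme itself. As a hedged illustration of the general idea only, and not the authors' method, a first layer's weight matrix can be seeded with the top principal directions of the training inputs; the function name, padding scheme, and scale below are assumptions:

```python
import numpy as np

def pca_init_weights(X, n_units, rng=None):
    """Illustrative PCA-based weight initialization (an assumption, not
    the paper's exact scheme): use the top principal directions of the
    training inputs as the initial rows of the first layer's weights."""
    rng = np.random.default_rng(rng)
    Xc = X - X.mean(axis=0)                      # center the inputs
    # SVD of the centered data; rows of Vt are principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = min(n_units, Vt.shape[0])
    W = np.empty((n_units, X.shape[1]))
    W[:k] = Vt[:k]                               # deterministic, data-driven rows
    if n_units > k:                              # pad any extra units randomly
        W[k:] = rng.normal(scale=0.01, size=(n_units - k, X.shape[1]))
    return W

X = np.random.default_rng(0).normal(size=(200, 8))
W = pca_init_weights(X, n_units=5)
print(W.shape)  # (5, 8)
```

Because the rows come from an orthonormal basis, such an initialization is deterministic given the data, which is consistent with the abstract's claim of reduced randomness at initialization.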
Pages: 14897 - 14910
Number of pages: 13
Related Papers
50 records
  • [1] Multi-output incremental back-propagation
    Chaudhari, Rachana
    Agarwal, Dhwani
    Ravishankar, Kritika
    Masand, Nikita
    Sambhe, Vijay K.
    Udmale, Sandeep S.
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (20): 14897 - 14910
  • [2] BACK-PROPAGATION
    JONES, WP
    HOSKINS, J
    BYTE, 1987, 12 (11): 155 - &
  • [3] Back-propagation of accuracy
    Senashova, MY
    Gorban, AN
    Wunsch, DC
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997: 1998 - 2001
  • [4] Back-propagation with Chaos
    Fazayeli, Farideh
    Wang, Lipo
    Liu, Wen
    2008 INTERNATIONAL CONFERENCE ON NEURAL NETWORKS AND SIGNAL PROCESSING, VOLS 1 AND 2, 2007: 5 - 8
  • [5] Back-propagation is not efficient
    Sima, J
    NEURAL NETWORKS, 1996, 9 (06): 1017 - 1023
  • [6] Sequential Back-Propagation
    Wang, Hui
    Liu, Da-You
    Wang, Ya-Fei
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 1994, (03): 252 - 260
  • [7] Improving back-propagation: Epsilon-back-propagation
    Trejo, LA
    Sandoval, C
    FROM NATURAL TO ARTIFICIAL NEURAL COMPUTATION, 1995, 930 : 427 - 432
  • [8] Error back-propagation in multi-valued logic systems
    Apostolikas, Georgios
    Konstantopoulos, Stasinos
    ICCIMA 2007: INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND MULTIMEDIA APPLICATIONS, VOL IV, PROCEEDINGS, 2007: 207 - 213
  • [9] FEATURE CONSTRUCTION FOR BACK-PROPAGATION
    PIRAMUTHU, S
    LECTURE NOTES IN COMPUTER SCIENCE, 1991, 496 : 264 - 268
  • [10] On the Local Hessian in Back-propagation
    Zhang, Huishuai
    Chen, Wei
    Liu, Tie-Yan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31