Algorithm Research on Improving Activation Function of Convolutional Neural Networks

Cited by: 0
Authors
Guo, Yanhua [1 ]
Sun, Lei [1 ]
Zhang, Zhihong [2 ]
He, Hong [1 ]
Affiliations
[1] Tianjin Univ Technol, Sch Elect Engn & Automat, Tianjin Key Lab Control Theory & Applicat Complic, Tianjin 300384, Peoples R China
[2] Tianjin Inst Elect Technol, Tianjin 300232, Peoples R China
Keywords
Deep Learning; Convolutional Neural Networks; Activation function; Image classification;
DOI
10.1109/ccdc.2019.8833156
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812 ;
Abstract
To address the slow convergence of models using activation functions such as Sigmoid, Tanh, ReLU, and Softplus, and the non-convergence caused by gradient dispersion, this paper proposes an algorithm to improve the activation function of convolutional neural networks. Using R-SReLU as the activation function of the neural network, the convergence speed of the network under various activation functions and the resulting image-recognition accuracy are analyzed. The experimental data show that the improved activation function R-SReLU not only converges quickly but also achieves a small error rate and improves classification accuracy more effectively. The maximum recognition accuracy reaches 88.03%.
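The baseline activation functions the abstract compares can be sketched as follows. This is only an illustrative sketch: the exact definition of the proposed R-SReLU is not given in this record, so it is omitted; the gradient helper shows why saturating activations such as Sigmoid suffer from the gradient dispersion the paper targets.

```python
import numpy as np

def sigmoid(x):
    # Classic saturating activation: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered saturating activation
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit: max(0, x), non-saturating for x > 0
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x),
    # written in a numerically stable form
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def sigmoid_grad(x):
    # d/dx sigmoid(x) = s * (1 - s); vanishes for large |x|,
    # which is the "gradient dispersion" problem in deep networks
    s = sigmoid(x)
    return s * (1.0 - s)
```

For example, `sigmoid_grad(20.0)` is already below 1e-6, so gradients flowing through a saturated sigmoid unit effectively vanish, while `relu` passes gradients of exactly 1 for any positive input.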
Pages: 3582-3586
Page count: 5