Algorithm Research on Improving Activation Function of Convolutional Neural Networks

Cited: 0
Authors
Guo, Yanhua [1 ]
Sun, Lei [1 ]
Zhang, Zhihong [2 ]
He, Hong [1 ]
Affiliations
[1] Tianjin Univ Technol, Sch Elect Engn & Automat, Tianjin Key Lab Control Theory & Applicat Complic, Tianjin 300384, Peoples R China
[2] Tianjin Inst Elect Technol, Tianjin 300232, Peoples R China
Keywords
Deep Learning; Convolutional Neural Networks; Activation function; Image classification;
DOI
10.1109/ccdc.2019.8833156
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline Classification Code
0812;
Abstract
To address the slow convergence of models that use the Sigmoid, Tanh, ReLU, and Softplus activation functions, and the non-convergence caused by gradient dispersion, this paper proposes an algorithm to improve the activation function of convolutional neural networks. Using R-SReLU as the activation function of the neural network, the convergence speed of the various activation functions and the resulting image-recognition accuracy are analyzed. The experimental data show that the improved activation function R-SReLU not only converges quickly but also yields a low error rate, improving classification accuracy more effectively. The maximum recognition accuracy reaches 88.03%.
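The "gradient dispersion" the abstract refers to is the vanishing-gradient effect of saturating activations. The record does not give the definition of R-SReLU itself, so the following is only an illustrative sketch of the four baseline activations named in the abstract: the derivatives of Sigmoid and Tanh shrink toward zero for large |x| (which slows or stalls convergence), while ReLU keeps a constant unit gradient on its positive side and Softplus approaches one.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Analytical derivatives of the four baseline activations from the abstract.
def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)            # saturates: -> 0 as |x| grows (max 0.25 at x=0)

def d_tanh(x):
    return 1.0 - math.tanh(x) ** 2  # saturates: -> 0 as |x| grows

def d_relu(x):
    return 1.0 if x > 0 else 0.0    # constant 1 on the positive side

def d_softplus(x):
    return sigmoid(x)               # -> 1 for large x, -> 0 for x << 0

for x in (0.0, 2.0, 5.0):
    print(f"x={x:4.1f}  sigmoid'={d_sigmoid(x):.4f}  tanh'={d_tanh(x):.4f}  "
          f"relu'={d_relu(x):.1f}  softplus'={d_softplus(x):.4f}")
```

At x = 5 the Sigmoid and Tanh gradients are already close to zero, while ReLU's is exactly 1; this is the convergence-speed gap the paper's rectified-linear-based R-SReLU is designed to exploit.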
Pages: 3582 - 3586
Page count: 5
Related Papers
50 records in total
  • [1] RETRACTED: Improving Convolutional Neural Networks with Competitive Activation Function (Retracted Article)
    Ying, Yao
    Zhang, Nengbo
    He, Ping
    Peng, Silong
    SECURITY AND COMMUNICATION NETWORKS, 2021, 2021
  • [2] Optimizing nonlinear activation function for convolutional neural networks
    Varshney, Munender
    Singh, Pravendra
    SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 : 1323 - 1330
  • [3] Optimizing nonlinear activation function for convolutional neural networks
    Varshney, Munender
    Singh, Pravendra
    SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 (06) : 1323 - 1330
  • [4] Deep green function convolution for improving saliency in convolutional neural networks
    Beaini, Dominique
    Achiche, Sofiane
    Duperré, Alexandre
    Raison, Maxime
    VISUAL COMPUTER, 2021, 37 : 227 - 244
  • [5] Deep green function convolution for improving saliency in convolutional neural networks
    Beaini, Dominique
    Achiche, Sofiane
    Duperre, Alexandre
    Raison, Maxime
    VISUAL COMPUTER, 2021, 37 (02): : 227 - 244
  • [6] Convolutional Neural Network Algorithm with Parameterized Activation Function for Melanoma Classification
    Namozov, Abdulaziz
    Cho, Young Im
    2018 INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY CONVERGENCE (ICTC), 2018, : 417 - 419
  • [7] SinP[N]: A Fast Convergence Activation Function for Convolutional Neural Networks
    Chan, Ka-Hou
    Im, Sio-Kei
    Ke, Wei
    Lei, Ngan-Lin
    2018 IEEE/ACM INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING COMPANION (UCC COMPANION), 2018, : 365 - 369
  • [8] Improving robustness of convolutional neural networks using element-wise activation scaling
    Zhang, Zhi-Yuan
    Ren, Hao
    He, Zhenli
    Zhou, Wei
    Liu, Di
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 149 : 136 - 148
  • [9] Training 4.6-Bit Convolutional Neural Networks with a HardTanh Activation Function
    Trusov, A. V.
    PATTERN RECOGNITION AND IMAGE ANALYSIS, 2025, 35 (01) : 44 - 64
  • [10] Improving Precipitation Forecasts with Convolutional Neural Networks
    Badrinath, Anirudhan
    delle Monache, Luca
    Hayatbini, Negin
    Chapman, Will
    Cannon, Forest
    Ralph, Marty
    WEATHER AND FORECASTING, 2023, 38 (02) : 291 - 306