SGD method for entropy error function with smoothing l0 regularization for neural networks

Cited by: 0
|
Authors
Nguyen, Trong-Tuan [1 ]
Thang, Van-Dat [2 ]
Nguyen, Van Thin [3 ]
Nguyen, Phuong T. [4 ]
Affiliations
[1] VNPT AI, Hanoi, Vietnam
[2] Viettel High Technol Ind Corp, Hanoi, Vietnam
[3] Thai Nguyen Univ Educ, 20 Luong Ngoc Quyen St, Thai Nguyen City, Vietnam
[4] Univ Aquila, Dept Informat Engn Comp Sci & Math, Via Vetoio Snc, Laquila, Italy
Keywords
Neural networks; l0 regularization; Entropy function; L1/2 regularization; Gradient descent; Approximation; Layer
DOI
10.1007/s10489-024-05564-1
CLC classification code
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
The entropy error function has been widely used in neural networks. Nevertheless, network training based on this error function generally suffers from a slow convergence rate and can easily be trapped in a local minimum, or even run into the incorrect saturation problem in practice. Although many results build on the entropy error function in neural networks and their applications, the theory of such an algorithm and its convergence have not been fully studied so far. To tackle the issue, this work proposes a novel entropy error function with smoothing l(0) regularization for feed-forward neural networks. An empirical evaluation conducted on real-world datasets demonstrates that the newly conceived algorithm substantially improves the prediction performance of the considered neural networks. More importantly, the experimental results also show that the proposed function yields more precise classifications than well-founded baselines. The work is novel in that it enables neural networks to learn effectively, producing more accurate predictions than state-of-the-art algorithms. In this respect, the algorithm is expected to contribute to existing studies in the field, advancing research in Machine Learning and Deep Learning.
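The record does not spell out which smoothing function the authors adopt for the l(0) penalty. A common smooth surrogate in this literature replaces the indicator 1(w != 0) with w^2/(w^2 + sigma^2), which is differentiable and tends to the true l0 count as sigma goes to 0. The following NumPy sketch is purely illustrative (function names, the logistic unit, and all hyperparameters are assumptions, not taken from the paper): it shows one possible SGD update on the cross-entropy (entropy) error of a single sigmoid unit plus the smoothed penalty.

```python
import numpy as np

def smoothed_l0(w, sigma=0.1):
    # Smooth surrogate for the l0 "norm": each weight contributes
    # w^2 / (w^2 + sigma^2), approaching 1(w != 0) as sigma -> 0.
    return np.sum(w**2 / (w**2 + sigma**2))

def smoothed_l0_grad(w, sigma=0.1):
    # Elementwise derivative of the surrogate, used in the SGD update.
    return 2.0 * w * sigma**2 / (w**2 + sigma**2)**2

def sgd_step(w, x, y, lr=0.1, lam=0.01, sigma=0.1):
    # One SGD step on the binary cross-entropy (entropy error) of a
    # single logistic unit, regularized by the smoothed l0 penalty.
    z = 1.0 / (1.0 + np.exp(-x @ w))      # sigmoid output
    grad_loss = (z - y) * x               # gradient of the entropy error
    return w - lr * (grad_loss + lam * smoothed_l0_grad(w, sigma))

# Toy run on a single training example with target y = 1.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
x = np.array([1.0, -2.0, 0.5])
for _ in range(100):
    w = sgd_step(w, x, y=1.0)
```

The penalty term pushes small weights toward exactly zero (sparsity) while leaving large weights nearly untouched, since the surrogate's gradient decays like 1/w^3 for |w| >> sigma; this is the usual motivation for l0-style regularizers over l1 or l2.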
Pages: 7213 - 7228
Page count: 16
Related papers
50 records in total
  • [1] Online gradient method with smoothing l0 regularization for feedforward neural networks
    Zhang, Huisheng
    Tang, Yanli
    NEUROCOMPUTING, 2017, 224 : 1 - 8
  • [2] Batch gradient training method with smoothing regularization for l0 feedforward neural networks
    Zhang, Huisheng
    Tang, Yanli
    Liu, Xiaodong
    NEURAL COMPUTING & APPLICATIONS, 2015, 26 (02): : 383 - 390
  • [3] Batch Gradient Training Method with Smoothing Group L0 Regularization for Feedfoward Neural Networks
    Zhang, Ying
    Wei, Jianing
    Xu, Dongpo
    Zhang, Huisheng
    NEURAL PROCESSING LETTERS, 2023, 55 (02) : 1663 - 1679
  • [5] Smoothing L0 Regularization for Extreme Learning Machine
    Fan, Qinwei
    Liu, Ting
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020
  • [6] Entropic Regularization of the l0 Function
    Borwein, Jonathan M.
    Luke, D. Russell
    FIXED-POINT ALGORITHMS FOR INVERSE PROBLEMS IN SCIENCE AND ENGINEERING, 2011, 49 : 65+
  • [7] SGD method for entropy error function with smoothing l0 regularization for neural networks
    Trong-Tuan Nguyen
    Van-Dat Thang
    Van Thin Nguyen
    Phuong T. Nguyen
    Applied Intelligence, 2024, 54 (13-14) : 7213 - 7228
  • [8] Sparse smooth group L0∘L1/2 regularization method for convolutional neural networks
    Quasdane, Mohamed
    Ramchoun, Hassan
    Masrour, Tawfik
    KNOWLEDGE-BASED SYSTEMS, 2024, 284
  • [9] Convergence of batch gradient algorithm with smoothing composition of group l0 and l1/2 regularization for feedforward neural networks
    Ramchoun, Hassan
    Ettaouil, Mohamed
    PROGRESS IN ARTIFICIAL INTELLIGENCE, 2022, 11 (03) : 269 - 278
  • [10] IMAGE SMOOTHING VIA A NOVEL ADAPTIVE WEIGHTED L0 REGULARIZATION
    Zhao, Wufan
    Wu, Tingting
    Feng, Chenchen
    Wu, Wenna
    Lv, Xiaoguang
    Chen, Hongming
    Liu, Jun
    INTERNATIONAL JOURNAL OF NUMERICAL ANALYSIS AND MODELING, 2025, 22 (01) : 21 - 39