Large-Margin Regularized Softmax Cross-Entropy Loss

Cited by: 30
Authors
Li, Xiaoxu [1 ]
Chang, Dongliang [1 ]
Tian, Tao [1 ]
Cao, Jie [1 ]
Affiliations
[1] Lanzhou Univ Technol, Sch Comp & Commun, Lanzhou 730050, Gansu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Neural networks; cross-entropy loss; large-margin regularization
DOI
10.1109/ACCESS.2019.2897692
CLC Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Softmax cross-entropy loss with L2 regularization is commonly adopted in the machine learning and neural network community. The traditional softmax cross-entropy loss simply focuses on fitting the training data accurately and does not explicitly encourage a large decision margin for classification, so several loss functions have been proposed to improve generalization performance by addressing this problem. However, these loss functions make model optimization more difficult. Inspired by regularized logistic regression, in which the regularization term adjusts the width of the decision margin and which can be seen as an approximation of the support vector machine, we propose a large-margin regularization method for the softmax cross-entropy loss. The proposed loss has two advantages: improved generalization performance and easy optimization. Experimental results on three small-sample datasets show that our regularization method achieves good performance and outperforms the existing popular regularization methods for neural networks.
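Illustrative code sketch
The abstract does not give the exact form of the regularizer. The sketch below is a minimal PyTorch illustration of the general idea only: standard softmax cross-entropy plus an additive penalty that explicitly rewards a wide decision margin, in the spirit of how the L2 term in regularized logistic regression approximates an SVM. The hinge-style penalty and the hyperparameters `lam` and `margin` are illustrative assumptions, not the authors' published formulation.

```python
import torch
import torch.nn.functional as F

def margin_regularized_cross_entropy(logits, targets, lam=0.1, margin=1.0):
    """Softmax cross-entropy plus a hinge-style large-margin penalty.

    The penalty is positive whenever the true-class logit fails to exceed
    the strongest competing logit by at least `margin`, so minimizing it
    pushes the decision margin wider. (Illustrative stand-in for the
    paper's regularizer, whose exact form is not given in the abstract.)
    """
    ce = F.cross_entropy(logits, targets)

    # z_y: logit of the true class for each sample
    true_logit = logits.gather(1, targets.unsqueeze(1)).squeeze(1)

    # max_{j != y} z_j: strongest competing logit (true class masked out)
    mask = F.one_hot(targets, num_classes=logits.size(1)).bool()
    runner_up = logits.masked_fill(mask, float("-inf")).max(dim=1).values

    # hinge penalty on the logit gap (z_y - max_{j != y} z_j)
    margin_penalty = F.relu(margin - (true_logit - runner_up)).mean()
    return ce + lam * margin_penalty

# Usage: a batch of 8 samples over 10 classes with random logits.
logits = torch.randn(8, 10, requires_grad=True)
targets = torch.randint(0, 10, (8,))
loss = margin_regularized_cross_entropy(logits, targets)
loss.backward()
print(loss.item())
```

Unlike margin-based softmax variants that reshape the logits inside the softmax and are therefore harder to optimize, an additive penalty of this kind leaves the cross-entropy term untouched, which matches the abstract's claim of easy optimization; `lam` trades data fit against margin width.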
Pages: 19572-19578
Number of pages: 7
Related Papers
50 records in total
  • [1] Large-Margin Softmax Loss for Convolutional Neural Networks
    Liu, Weiyang; Wen, Yandong; Yu, Zhiding; Yang, Meng
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [2] Improved Large-Margin Softmax Loss for Speaker Diarisation
    Fathullah, Y.; Zhang, C.; Woodland, P. C.
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 7104-7108
  • [3] Integrate Receptive Field Block into Large-margin Softmax Loss for Face Recognition
    Wei, Yi; Pu, Haibo; Zhu, Yu; Li, XiaoFan
    2019 3RD INTERNATIONAL CONFERENCE ON MACHINE VISION AND INFORMATION TECHNOLOGY (CMVIT 2019), 2019, 1229
  • [4] Investigation of Large-Margin Softmax in Neural Language Modeling
    Huo, Jingjing; Gao, Yingbo; Wang, Weiyue; Schlueter, Ralf; Ney, Hermann
    INTERSPEECH 2020, 2020: 3645-3649
  • [5] Convolutional Neural Networks with Large-Margin Softmax Loss Function for Cognitive Load Recognition
    Liu, Yuetian; Liu, Qingshan
    PROCEEDINGS OF THE 36TH CHINESE CONTROL CONFERENCE (CCC 2017), 2017: 4045-4049
  • [6] Balanced Softmax Cross-Entropy for Incremental Learning
    Jodelet, Quentin; Liu, Xin; Murata, Tsuyoshi
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT II, 2021, 12892: 385-396
  • [7] Balanced softmax cross-entropy for incremental learning with and without memory
    Jodelet, Quentin; Liu, Xin; Murata, Tsuyoshi
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2022, 225
  • [8] Large Margin Softmax Loss for Speaker Verification
    Liu, Yi; He, Liang; Liu, Jia
    INTERSPEECH 2019, 2019: 2873-2877
  • [9] Advancing neural network calibration: The role of gradient decay in large-margin Softmax optimization
    Zhang, Siyuan; Xie, Linbo
    NEURAL NETWORKS, 2024, 178
  • [10] Scalable Cross-Entropy Loss for Sequential Recommendations with Large Item Catalogs
    Mezentsev, Gleb; Gusak, Danil; Oseledets, Ivan; Frolov, Evgeny
    PROCEEDINGS OF THE EIGHTEENTH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2024, 2024: 475-485