Feature Selection with L1 Regularization in Formal Neurons

Cited: 0
Authors
Bobrowski, Leon [1 ,2 ]
Affiliations
[1] Bialystok Tech Univ, Fac Comp Sci, Wiejska 45A, Bialystok, Poland
[2] Inst Biocybernet & Biomed Engn, PAS, Warsaw, Poland
Keywords
high-dimensional data sets; formal neurons with a margin; feature selection; CPL criterion functions; L1 regularization
DOI
10.1007/978-3-031-62495-7_26
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Designing classifiers on high-dimensional learning data sets is an important task in artificial intelligence applications. For high-dimensional data, classifier design involves learning hierarchical neural networks combined with feature selection. Feature selection aims to omit features that are unnecessary for a given problem. In formal neurons, feature selection can be achieved by minimizing convex and piecewise linear (CPL) criterion functions with L1 regularization. Minimizing CPL criterion functions can be reduced to computations on a finite number of vertices in the parameter space.
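The mechanism the abstract describes can be sketched in a few lines: a margin-based hinge loss is convex and piecewise linear (CPL), and an L1 penalty drives the weights of uninformative features toward zero. The sketch below is illustrative only, not the paper's vertex-based CPL algorithm; the data, margin, and hyperparameters are assumptions, and the minimization uses ordinary subgradient steps with a proximal (soft-thresholding) step for the L1 term.

```python
import random

def soft_threshold(w, t):
    # Proximal operator of the L1 norm: shrinks w toward 0 by t,
    # setting it exactly to 0 when |w| <= t (this is what zeroes features).
    if w > t:
        return w - t
    if w < -t:
        return w + t
    return 0.0

def fit(X, y, lam=0.1, lr=0.05, epochs=500, margin=1.0):
    # Minimize (1/n) * sum hinge(margin - y * <w, x>) + lam * ||w||_1
    # by subgradient descent on the hinge term plus a proximal L1 step.
    n_features = len(X[0])
    w = [0.0] * n_features
    for _ in range(epochs):
        grad = [0.0] * n_features
        for xi, yi in zip(X, y):
            score = sum(wj * xj for wj, xj in zip(w, xi))
            if yi * score < margin:          # point is inside the margin
                for j in range(n_features):
                    grad[j] -= yi * xi[j]    # subgradient of the hinge loss
        w = [soft_threshold(wj - lr * gj / len(X), lr * lam)
             for wj, gj in zip(w, grad)]
    return w

# Toy data: the label depends only on feature 0; feature 1 is pure noise.
random.seed(0)
X = [[random.choice([-1.0, 1.0]), random.gauss(0.0, 1.0)] for _ in range(40)]
y = [1 if x[0] > 0 else -1 for x in X]

w = fit(X, y)
print(w)  # the weight on the noise feature is driven to (near) zero
```

The soft-thresholding step is the standard proximal map of the L1 norm; it is what makes L1 regularization perform feature selection rather than mere shrinkage, mirroring the role of the L1 term in the CPL criterion functions the abstract refers to.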
Pages: 343-353 (11 pages)
Related papers
50 records total (items [21]-[30] shown)
  • [21] Orbital minimization method with l1 regularization. Lu, Jianfeng; Thicke, Kyle. JOURNAL OF COMPUTATIONAL PHYSICS, 2017, 336: 87-103
  • [22] Stochastic PCA with l2 and l1 Regularization. Mianjy, Poorya; Arora, Raman. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [23] αl1 - βl2 regularization for sparse recovery. Ding, Liang; Han, Weimin. INVERSE PROBLEMS, 2019, 35 (12)
  • [24] ELM with L1/L2 regularization constraints. Feng, B.; Qin, K.; Jiang, Z. Hanjie Xuebao/Transactions of the China Welding Institution, 2018, 39 (09): 31-35
  • [25] Iterative L1/2 Regularization Algorithm for Variable Selection in the Cox Proportional Hazards Model. Liu, Cheng; Liang, Yong; Luan, Xin-Ze; Leung, Kwong-Sak; Chan, Tak-Ming; Xu, Zong-Ben; Zhang, Hai. ADVANCES IN SWARM INTELLIGENCE, ICSI 2012, PT II, 2012, 7332: 11-17
  • [26] Latent Variable Selection for Multidimensional Item Response Theory Models via L1 Regularization. Sun, Jianan; Chen, Yunxiao; Liu, Jingchen; Ying, Zhiliang; Xin, Tao. PSYCHOMETRIKA, 2016, 81 (04): 921-939
  • [27] Gene Selection in Cancer Classification Using Sparse Logistic Regression with L1/2 Regularization. Wu, Shengbing; Jiang, Hongkun; Shen, Haiwei; Yang, Ziyi. APPLIED SCIENCES-BASEL, 2018, 8 (09)
  • [28] Feature Selection With l2,1-2 Regularization. Shi, Yong; Miao, Jianyu; Wang, Zhengyu; Zhang, Peng; Niu, Lingfeng. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (10): 4967-4982
  • [29] Compact Deep Neural Networks with l1,1 and l1,2 Regularization. Ma, Rongrong; Niu, Lingfeng. 2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018: 1248-1254
  • [30] Parameter choices for sparse regularization with the l1 norm. Liu, Qianru; Wang, Rui; Xu, Yuesheng; Yan, Mingsong. INVERSE PROBLEMS, 2023, 39 (02)