Deep Feature Selection using an Enhanced Sparse Group Lasso Algorithm

Cited by: 0
Authors
Farokhmanesh, Fatemeh [1 ]
Sadeghi, Mohammad Taghi [1 ]
Affiliation
[1] Yazd Univ, Dept Elect Engn, Yazd, Iran
Keywords
feature selection; lasso; sparse representation; deep learning; regression
DOI
10.1109/iraniancee.2019.8786386
CLC Classification
TM (electrical technology); TN (electronic and communication technology)
Subject Classification Codes
0808; 0809
Abstract
Feature selection is an important method of data dimensionality reduction that is widely used in machine learning. Within this framework, sparse-representation-based feature selection methods are very attractive because, by their nature, they try to represent the data with as few non-zero coefficients as possible. Deep neural networks usually produce a very high-dimensional feature space, a situation in which feature selection approaches can be exploited. In this paper, first, three sparse feature selection methods are compared. The Sparse Group Lasso (SGL) algorithm is one of the adopted approaches. This method is theoretically well-founded and leads to good results for hand-crafted features; its most important property is that it strongly induces sparsity in the data. A key step of the SGL method is the grouping of features. In this paper, a k-means clustering based method is applied to group the features. Our experimental results show that this sparse-representation-based method leads to very successful results for deep neural network features.
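The pipeline the abstract describes — cluster the feature dimensions with k-means, then fit a Sparse Group Lasso over those groups — can be sketched as below. This is a minimal illustration, not the paper's implementation: all function names, parameter values, and the proximal-gradient solver are our own choices, and a tiny hand-rolled k-means stands in for a library call so the sketch stays self-contained.

```python
import numpy as np

def kmeans_groups(F, k, n_iter=50, seed=0):
    """Tiny k-means over the rows of F (here: one row per feature dimension)."""
    rng = np.random.default_rng(seed)
    centers = F[rng.choice(len(F), k, replace=False)]
    for _ in range(n_iter):
        dists = ((F[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for g in range(k):                       # update non-empty clusters
            if np.any(labels == g):
                centers[g] = F[labels == g].mean(0)
    return labels

def sgl_select(X, y, n_groups=4, lam1=0.05, lam2=0.05, lr=0.1, n_iter=300):
    """Sparse Group Lasso (squared loss) via proximal gradient descent.

    Features are grouped by k-means clustering of the columns of X,
    mirroring the grouping step described in the abstract.
    """
    n, p = X.shape
    groups = kmeans_groups(X.T, n_groups)        # cluster feature columns
    w = np.zeros(p)
    for _ in range(n_iter):
        z = w - lr * (X.T @ (X @ w - y)) / n     # gradient step on squared loss
        # l1 prox: element-wise soft thresholding (lasso part)
        z = np.sign(z) * np.maximum(np.abs(z) - lr * lam1, 0.0)
        # group prox: shrink each group's l2 norm (group-lasso part)
        for g in range(n_groups):
            idx = groups == g
            norm = np.linalg.norm(z[idx])
            thr = lr * lam2 * np.sqrt(idx.sum())
            z[idx] = 0.0 if norm <= thr else z[idx] * (1.0 - thr / norm)
        w = z
    return w, groups
```

Features with non-zero entries in the returned weight vector are the selected ones; the combined penalty zeroes out both individual coefficients (via `lam1`) and whole groups (via `lam2`), which is the "highly induces sparsity" behavior the abstract attributes to SGL.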
Pages: 1549 - 1552 (4 pages)
Related Papers
50 records
  • [21] Fused lasso for feature selection using structural information
    Cui, Lixin
    Bai, Lu
    Wang, Yue
    Yu, Philip S.
    Hancock, Edwin R.
    PATTERN RECOGNITION, 2021, 119
  • [22] Efficient Feature Selection for Prediction of Diabetic Using LASSO
    Kumarage, Prabha M.
    Yogarajah, B.
    Ratnarajah, Nagulan
    2019 19TH INTERNATIONAL CONFERENCE ON ADVANCES IN ICT FOR EMERGING REGIONS (ICTER - 2019), 2019,
  • [23] BP Neural Network Feature Selection Based on Group Lasso Regularization
    Liu, Tiqian
    Xiao, Jiang-Wen
    Huang, Zhengyi
    Kong, Erdan
    Liang, Yuntao
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 2786 - 2790
  • [24] Group Lasso Regularized Multiple Kernel Learning for Heterogeneous Feature Selection
    Yeh, Yi-Ren
    Chung, Yung-Yu
    Lin, Ting-Chu
    Wang, Yu-Chiang Frank
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 2570 - 2577
  • [25] Sparse Neural Additive Model: Interpretable Deep Learning with Feature Selection via Group Sparsity
    Xu, Shiyun
    Bu, Zhiqi
    Chaudhari, Pratik
    Barnett, Ian J.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT III, 2023, 14171 : 343 - 359
  • [26] Feature Selection With Group-Sparse Stochastic Gates
    Park, Hyeryn
    Lee, Changhee
    IEEE ACCESS, 2024, 12 : 102299 - 102312
  • [27] Feature selection algorithm based on mutual information and lasso for microarray data
    Zhongxin W.
    Gang S.
    Jing Z.
    Jia Z.
    Bentham Science Publishers, (10): 278 - 286
  • [28] Feature Selection Tracking Algorithm Based on Sparse Representation
    Lou, Hui-dong
    Li, Wei-guang
    Hou, Yue-en
    Yao, Qing-he
    Ye, Guo-qiang
    Wan, Hao
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2015, 2015
  • [29] Unsupervised Feature Selection Algorithm Based on Sparse Representation
    Cui, Guoqing
    Yang, Jie
    Zareapoor, Masoumeh
    Wang, Jiechen
    2016 3RD INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2016, : 1028 - 1033
  • [30] Consistency of Sparse-Group Lasso Graphical Model Selection for Time Series
    Tugnait, Jitendra K.
    2020 54TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2020, : 589 - 593