Modified MMS: Minimization Approach for Model Subset Selection

Cited by: 0
Authors:
Rajathi, C. [1 ]
Rukmani, P. [1 ]
Affiliations:
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, India
Source:
CMC-COMPUTERS MATERIALS & CONTINUA | 2023, Vol. 77, No. 1
Keywords:
Ensemble learning; intrusion detection; minimization; model diversity; DIFFERENTIAL VARIATIONAL-INEQUALITIES; GLOBAL ERROR-BOUNDS; GAP FUNCTIONS; HEMIVARIATIONAL INEQUALITIES; REGULARIZATION METHOD; PENALTY; EVOLUTION; ENSEMBLE; DRIVEN;
DOI:
10.32604/cmc.2023.041507
CLC Number:
TP [Automation Technology, Computer Technology]
Discipline Code:
0812
Abstract
Considering the recent developments in the digital environment, ensuring a higher level of security for networking systems is imperative. Many security approaches are constantly being developed to protect against evolving threats. Based on the findings of many prior studies, an ensemble model for the intrusion classification system yielded promising results. This research work aimed to create a more diverse and effective ensemble model. To this end, Decision Tree (DT), Support Vector Machine (SVM), and Random Forest (RF) classifiers from existing studies were trained to run as independent models. Once the individual models were trained, a Correlation-Based Diversity Matrix (CDM) was created by determining their closeness. The models for the ensemble were chosen by the proposed Modified Minimization Approach for Model Subset Selection (Modified-MMS), which takes the lower-triangular CDM (L-CDM) as input. The performance of the proposed algorithm was assessed on the Network Security Laboratory-Knowledge Discovery in Databases (NSL-KDD) dataset using several performance metrics, including accuracy, precision, recall, and F1-score. By selecting a diverse set of models, the proposed system enhances the performance of an ensemble by reducing overfitting and increasing prediction accuracy. The proposed work achieved an impressive accuracy of 99.26% using only two classification models in the ensemble, surpassing the performance of a larger ensemble that employs six classification models.
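A minimal, illustrative sketch of the pipeline the abstract describes is given below. It assumes synthetic data from scikit-learn's make_classification in place of NSL-KDD, uses Pearson correlation between prediction vectors as the "closeness" measure for the CDM, and substitutes a simple least-correlated-pair rule for the paper's Modified-MMS subset selection; these choices are assumptions made for illustration, not the authors' exact procedure.

```python
# Illustrative sketch only: the CDM construction and the pair-selection rule
# below are simplifying assumptions, not the paper's exact Modified-MMS.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for NSL-KDD features/labels (assumption).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

base_models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
}

# 1) Train each model independently and collect its test-set predictions.
preds = {}
for name, model in base_models.items():
    model.fit(X_tr, y_tr)
    preds[name] = model.predict(X_te)

# 2) Correlation-Based Diversity Matrix (CDM): pairwise Pearson correlation
#    of the prediction vectors (one simple notion of model "closeness").
names = list(preds)
cdm = np.corrcoef(np.vstack([preds[n] for n in names]))

# 3) Restrict to the lower triangle (L-CDM) and pick the least-correlated,
#    i.e. most diverse, pair of models (a stand-in for Modified-MMS).
mask = np.tril(np.ones_like(cdm, dtype=bool), k=-1)
i, j = np.unravel_index(np.argmin(np.where(mask, cdm, np.inf)), cdm.shape)
chosen = [names[i], names[j]]

# 4) Combine the selected pair with majority voting and evaluate.
ensemble = VotingClassifier([(n, base_models[n]) for n in chosen], voting="hard")
ensemble.fit(X_tr, y_tr)
print("Selected models:", chosen, "Accuracy:", ensemble.score(X_te, y_te))
```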
Pages: 733-756
Page count: 24