Information gain directed genetic algorithm wrapper feature selection for credit rating

Cited by: 214
Authors
Jadhav, Swati [1 ]
He, Hongmei [1 ]
Jenkins, Karl [1 ]
Affiliations
[1] Cranfield Univ, Sch Aerosp Transport & Mfg, Cranfield MK43 0AL, Beds, England
Keywords
Feature selection; Genetic algorithm in wrapper; Support vector machine; K nearest neighbour clustering; Naive Bayes classifier; Information gain; Credit scoring; Accuracy; ROC curve; SUPPORT VECTOR MACHINES; SWARM OPTIMIZATION; CLASSIFICATION; HYBRID; COMBINATION; MODEL; SVM; SET;
DOI
10.1016/j.asoc.2018.04.033
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Financial credit scoring is one of the crucial processes in the finance industry, used to assess the credit-worthiness of individuals and enterprises. Various statistics-based machine learning techniques have been employed for this task, but the "curse of dimensionality" remains a significant challenge. Some research has applied Feature Selection (FS) with a genetic algorithm as a wrapper to improve the performance of credit scoring models. However, the challenge lies in finding an overall best method for credit scoring problems and in reducing the time-consuming feature selection process. In this study, the credit scoring problem is investigated through feature selection to improve classification performance. This work proposes a novel approach to feature selection in credit scoring applications, called the Information Gain Directed Feature Selection algorithm (IGDFS), which ranks features by information gain and propagates the top-ranked features through the GA wrapper (GAW) algorithm, using three classical machine learning algorithms, KNN, Naive Bayes and Support Vector Machine (SVM), for credit scoring. The first stage of information-gain-guided feature selection helps reduce the computational complexity of the GA wrapper, and the information gain of the features selected by IGDFS indicates their importance to decision making. Regarding classification accuracy, SVM is consistently better than KNN and NB across the baseline techniques, GAW and IGDFS. IGDFS also achieves better performance than the generic GAW, and GAW obtains better performance than the corresponding single classifiers (baseline) in almost all cases; the exception is the German Credit dataset, where IGDFS + KNN performs worse than the generic GAW and the single classifier KNN. Removing features with low information gain may conflict with the original data structure exploited by KNN, and thus affect the performance of IGDFS + KNN. Regarding ROC performance, on the German Credit dataset the three classic machine learning algorithms, SVM, KNN and Naive Bayes, within the IGDFS GA wrapper obtain almost the same performance. On the Australian Credit and Taiwan Credit datasets, IGDFS + Naive Bayes achieves the largest area under the ROC curve. (C) 2018 Elsevier B.V. All rights reserved.
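
The two-stage IGDFS pipeline described above (an information-gain filter that retains the top-ranked features, followed by a GA wrapper that searches feature subsets scored by a classifier's cross-validated accuracy) can be illustrated with a short sketch. The Python code below is a minimal illustration under stated assumptions, not the authors' implementation: it uses scikit-learn's mutual_info_classif as a stand-in for information gain, a bundled toy dataset in place of the credit datasets, an RBF SVM as the wrapped classifier, and illustrative settings (TOP_K, POP, GENS, P_MUT) chosen only for brevity.

# Hedged sketch of an information-gain-directed GA wrapper (IGDFS-style) feature
# selection. All hyperparameters below are illustrative, not taken from the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer  # stand-in dataset, not a credit dataset
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)

# Stage 1: information-gain (mutual information) ranking keeps the top-K features.
TOP_K = 15
ig = mutual_info_classif(X, y, random_state=0)
top_idx = np.argsort(ig)[::-1][:TOP_K]
X_top = X[:, top_idx]

# Stage 2: GA wrapper searches binary masks over the retained features.
def fitness(mask):
    """Mean cross-validated accuracy of an SVM on the features switched on by the mask."""
    if not mask.any():
        return 0.0
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X_top[:, mask], y, cv=5).mean()

POP, GENS, P_MUT = 20, 15, 0.1
population = rng.random((POP, TOP_K)) < 0.5  # random initial binary masks

for gen in range(GENS):
    scores = np.array([fitness(ind) for ind in population])
    order = np.argsort(scores)[::-1]
    elite = population[order[: POP // 2]]            # truncation selection
    children = []
    while len(children) < POP - len(elite):
        a, b = elite[rng.integers(len(elite), size=2)]
        cut = rng.integers(1, TOP_K)                 # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child ^= rng.random(TOP_K) < P_MUT           # bit-flip mutation
        children.append(child)
    population = np.vstack([elite, children])

best = population[np.argmax([fitness(ind) for ind in population])]
print("selected original feature indices:", top_idx[best])
print("cross-validated accuracy:", fitness(best))

A real IGDFS-style run would substitute the German, Australian or Taiwan credit data, tune the GA settings, and repeat the wrapper with KNN and Naive Bayes classifiers to reproduce the comparisons reported in the abstract.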
Pages: 541-553
Number of pages: 13