Cost-sensitive classifier chains: Selecting low-cost features in multi-label classification

Times cited: 19
Authors
Teisseyre, Pawel [1 ]
Zufferey, Damien [2 ]
Slomka, Marta [3 ]
Affiliations
[1] Polish Acad Sci, Inst Comp Sci, Jana Kazimierza 5, PL-01248 Warsaw, Poland
[2] BAO Syst, Washington, DC USA
[3] Polish Acad Sci, Mossakowski Med Res Ctr, Warsaw, Poland
Keywords
Multi-label classification; Cost-sensitive feature selection; Classifier chains; Logistic regression; Stability; Generalization error bounds;
DOI
10.1016/j.patcog.2018.09.012
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Feature selection is one of the key challenges in multi-label classification, and many methods have been proposed in recent years. However, the existing approaches assume that all features have the same cost. This assumption may be inappropriate when acquiring feature values is costly; for example, in medical diagnosis each diagnostic value extracted by a clinical test is associated with its own cost. In such cases it may be better to choose a model with acceptable classification performance but a much lower cost. We propose a novel method that incorporates feature cost information into the learning process. The method, named Cost-Sensitive Classifier Chains, combines classifier chains with penalized logistic regression using a modified elastic-net penalty that takes the feature costs into account. We prove the stability of the algorithm and provide a bound on its generalization error. We also propose an adaptive version in which the penalty factors change while fitting the consecutive models in the chain. The methods are applied to real datasets, MIMIC-II and Hepatitis, for which the cost information is provided by experts. Moreover, we propose an experimental framework in which the features are observed with measurement errors and the costs depend on the quality of the features; this framework allows cost-sensitive methods to be compared on benchmark datasets for which no cost information is provided. The proposed method can be recommended when one wants to balance low cost against high prediction performance. (C) 2018 Elsevier Ltd. All rights reserved.
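To make the described approach concrete, the following is a minimal, self-contained sketch (not the authors' implementation) of the core idea: a classifier chain in which each binary model is a logistic regression whose elastic-net penalty is weighted per coefficient by the cost of the corresponding feature, so that expensive features are shrunk toward zero more aggressively. The proximal-gradient solver, the synthetic data, the cost vector, and all hyper-parameter values below are illustrative assumptions rather than values taken from the paper, and the adaptive variant with chain-dependent penalty factors is not shown.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def fit_cost_en_logreg(X, y, costs, lam=0.1, alpha=0.5, lr=0.1, n_iter=500):
    """Logistic regression penalized by sum_j lam * c_j * (alpha*|b_j| + (1-alpha)/2 * b_j**2),
    fitted with proximal gradient descent (ISTA). The intercept is left unpenalized."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = 0.0
    for _ in range(n_iter):
        resid = _sigmoid(X @ beta + intercept) - y   # derivative of the logistic loss wrt the margin
        grad = X.T @ resid / n
        intercept -= lr * resid.mean()
        z = beta - lr * grad                         # plain gradient step on the smooth loss
        l1 = lr * lam * alpha * costs                # cost-weighted L1 part of the penalty
        l2 = lr * lam * (1.0 - alpha) * costs        # cost-weighted L2 part of the penalty
        beta = np.sign(z) * np.maximum(np.abs(z) - l1, 0.0) / (1.0 + l2)  # proximal update
    return beta, intercept

def fit_chain(X, Y, costs, **kwargs):
    """Classifier chain: the k-th binary model sees X plus the true values of labels 1..k-1.
    Labels appended along the chain are treated as cost-free extra features (an assumption here)."""
    models = []
    X_aug = np.asarray(X, dtype=float)
    c_aug = np.asarray(costs, dtype=float)
    for k in range(Y.shape[1]):
        models.append(fit_cost_en_logreg(X_aug, Y[:, k], c_aug, **kwargs))
        X_aug = np.hstack([X_aug, Y[:, [k]]])        # append the true label during training
        c_aug = np.append(c_aug, 0.0)                # chained labels incur no acquisition cost
    return models

def predict_chain(models, X, threshold=0.5):
    """At test time, predictions of earlier labels are fed down the chain."""
    X_aug = np.asarray(X, dtype=float)
    preds = []
    for beta, intercept in models:
        y_hat = (_sigmoid(X_aug @ beta + intercept) >= threshold).astype(float)
        preds.append(y_hat)
        X_aug = np.hstack([X_aug, y_hat[:, None]])
    return np.column_stack(preds)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))
    Y = (X[:, :3] + 0.3 * rng.normal(size=(200, 3)) > 0).astype(float)
    costs = np.array([1.0, 1.0, 1.0, 10.0, 10.0, 10.0])   # the last three features are expensive
    models = fit_chain(X, Y, costs, lam=0.05, alpha=0.8)
    print(predict_chain(models, X[:5, :]))
```

The key design point illustrated here is that the penalty factor of coefficient j scales with costs[j], while labels appended along the chain receive zero cost, so exploiting label dependencies is never penalized as feature acquisition.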
Pages: 290-319
Number of pages: 30
Related papers
50 records in total
  • [1] Cost-sensitive label embedding for multi-label classification
    Huang, Kuan-Hao
    Lin, Hsuan-Tien
    MACHINE LEARNING, 2017, 106 (9-10) : 1725 - 1746
  • [2] Multi-label thresholding for cost-sensitive classification
    Alotaibi, Reem
    Flach, Peter
    NEUROCOMPUTING, 2021, 436 : 232 - 247
  • [3] Condensed Filter Tree for Cost-Sensitive Multi-Label Classification
    Li, Chun-Liang
    Lin, Hsuan-Tien
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 1), 2014, 32
  • [4] Multi-label Classification with Feature-aware Cost-sensitive Label Embedding
    Chiu, Hsien-Chun
    Lin, Hsuan-Tien
    2018 CONFERENCE ON TECHNOLOGIES AND APPLICATIONS OF ARTIFICIAL INTELLIGENCE (TAAI), 2018, : 40 - 45
  • [5] Cost-sensitive ensemble learning algorithm for multi-label classification problems
    Fu, Z.-L. (fzliang@netease.com), 1600, Science Press (40):
  • [6] Dynamic principal projection for cost-sensitive online multi-label classification
    Chu, Hong-Min
    Huang, Kuan-Hao
    Lin, Hsuan-Tien
    MACHINE LEARNING, 2019, 108 (8-9) : 1193 - 1230
  • [7] GROUP SENSITIVE CLASSIFIER CHAINS FOR MULTI-LABEL CLASSIFICATION
    Huang, Jun
    Li, Guorong
    Wang, Shuhui
    Zhang, Weigang
    Huang, Qingming
    2015 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO (ICME), 2015
  • [8] Cost-sensitive Encoding for Label Space Dimension Reduction Algorithms on Multi-label Classification
    Lo, Kuo-Hsuan
    Lin, Hsuan-Tien
    2017 CONFERENCE ON TECHNOLOGIES AND APPLICATIONS OF ARTIFICIAL INTELLIGENCE (TAAI), 2017, : 136 - 141