Meta-Scaler: A Meta-Learning Framework for the Selection of Scaling Techniques

Cited by: 1
Authors
de Amorim, Lucas B. V. [1,2]
Cavalcanti, George D. C. [1]
Cruz, Rafael M. O. [3]
Affiliations
[1] Univ Fed Pernambuco, Ctr Informat, BR-50670901 Recife, Brazil
[2] Univ Fed Alagoas, Inst Computacao, BR-57072900 Maceio, Brazil
[3] Univ Quebec, Ecole Technol Super, Montreal, PQ H3C 3J7, Canada
Keywords
Classification; meta-learning (MtL); normalization; scaling techniques (STs); dynamic classifier selection; data complexity; automatic recommendation; algorithm selection; ensemble selection; ranking; combination; accuracy; features
DOI
10.1109/TNNLS.2024.3366615
CLC number
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Dataset scaling, also known as normalization, is an essential preprocessing step in a machine learning (ML) pipeline. It adjusts the scale of the attributes so that they all vary within the same range. This transformation is known to improve the performance of classification models. Still, there are several scaling techniques (STs) to choose from, and no ST is guaranteed to be the best for a dataset regardless of the classifier chosen; the choice is thus problem- and classifier-dependent. Furthermore, selecting the wrong technique can cause a large drop in performance, so the decision should not be neglected. However, the trial-and-error process of finding the most suitable technique for a particular dataset can be infeasible. As an alternative, we propose the Meta-scaler, which uses meta-learning (MtL) to build meta-models that automatically select the best ST for a given dataset and classification algorithm. The meta-models learn to represent the relationship between meta-features extracted from the datasets and the performance of specific classification algorithms on these datasets when scaled with different techniques. Our experiments using 12 base classifiers, 300 datasets, and five STs demonstrate the feasibility and effectiveness of the approach. When using the ST selected by the Meta-scaler for each dataset, 10 of the 12 base models tested achieved statistically significantly better classification performance than with any fixed choice of a single ST. The Meta-scaler also outperforms state-of-the-art MtL approaches for ST selection. The source code, data, and results from the experiments in this article are available at a GitHub repository (http://github.com/amorimlb/meta_scaler).
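
To make the workflow described in the abstract concrete, the following is a minimal Python sketch of the general idea: label each training dataset with the ST that performs best for a fixed classifier, describe the dataset with meta-features, and fit a meta-model that recommends an ST for unseen datasets. The scikit-learn scalers, synthetic datasets, and tiny meta-feature set below are illustrative assumptions, not the paper's actual ST pool, benchmark datasets, or meta-features.

# Minimal sketch (not the authors' implementation) of meta-learning for
# scaling-technique (ST) selection: build a meta-dataset of
# (dataset meta-features -> best ST for a fixed classifier) pairs and
# train a meta-model to recommend an ST for new datasets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import (
    MinMaxScaler, StandardScaler, MaxAbsScaler, RobustScaler, QuantileTransformer,
)

# Candidate scaling techniques (an assumed pool of five STs).
STS = {
    "minmax": MinMaxScaler(),
    "zscore": StandardScaler(),
    "maxabs": MaxAbsScaler(),
    "robust": RobustScaler(),
    "quantile": QuantileTransformer(n_quantiles=50, output_distribution="normal"),
}

def meta_features(X, y):
    """Very small illustrative meta-feature vector describing a dataset."""
    return np.array([
        X.shape[0],                            # number of instances
        X.shape[1],                            # number of attributes
        np.mean(np.std(X, axis=0)),            # average attribute dispersion
        np.mean(np.abs(np.mean(X, axis=0))),   # average attribute offset
        len(np.unique(y)),                     # number of classes
    ])

def best_st_for(X, y, clf):
    """Label a dataset with the ST that maximizes cross-validated accuracy."""
    scores = {}
    for name, scaler in STS.items():
        Xs = scaler.fit_transform(X)  # fit on the full dataset only to label this meta-example
        scores[name] = cross_val_score(clf, Xs, y, cv=3).mean()
    return max(scores, key=scores.get)

# Build a toy meta-dataset from synthetic classification problems with
# deliberately varied attribute scales and offsets.
rng = np.random.RandomState(0)
meta_X, meta_y = [], []
for i in range(30):
    X, y = make_classification(
        n_samples=200, n_features=10, n_informative=5,
        scale=rng.uniform(0.1, 100.0), shift=rng.uniform(-10.0, 10.0), random_state=i,
    )
    meta_X.append(meta_features(X, y))
    meta_y.append(best_st_for(X, y, KNeighborsClassifier()))

# The meta-model maps dataset meta-features to the recommended ST.
meta_model = RandomForestClassifier(random_state=0).fit(np.array(meta_X), meta_y)

# Recommend an ST for a new (unseen) dataset.
X_new, y_new = make_classification(n_samples=200, n_features=10, n_informative=5,
                                   scale=3.0, random_state=999)
print("Recommended ST:", meta_model.predict([meta_features(X_new, y_new)])[0])

In the sketch, the scalers are fit on the full dataset only when labeling meta-examples; in a deployed pipeline the chosen scaler would be fit inside each training fold to avoid leakage. The paper's actual meta-features, base classifiers, and meta-models are substantially richer than this illustration.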
Page count: 15