Extremely Randomized Forest with Hierarchy of Multi-label Classifiers

Citations: 0
Authors
Li, Jinxia [1 ]
Zheng, Yihan [2 ]
Han, Chao [2 ]
Wu, Qingyao [2 ,3 ]
Chen, Jian [2 ]
Affiliations
[1] Hebei Univ Econ & Business, Comp Ctr, Shijiazhuang 300000, Hebei, Peoples R China
[2] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Guangdong, Peoples R China
[3] Chinese Acad Sci, Inst Software, State Key Lab Comp Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-label classification; Random forest; Hierarchy of classifiers;
DOI
10.1007/978-3-319-67777-4_40
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hierarchy Of Multi-label classifiERs (HOMER) is one of the most popular multi-label classification approaches. However, its applicability to large-scale problems is limited by the high computational complexity of building the hierarchical model. In this paper, we propose a novel approach, called Extremely Randomized Forest with Hierarchy of multi-label classifiers (ERF-H), to effectively construct an ensemble of randomized HOMER trees for multi-label classification. In ERF-H, we randomly choose data samples with replacement from the original dataset for each HOMER tree. We construct each HOMER tree by clustering labels to split the nodes at each level of the hierarchy, and learn a local multi-label classifier at every node. Extensive experiments show the effectiveness and efficiency of our approach compared to state-of-the-art multi-label classification methods.
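The abstract describes ERF-H procedurally: draw a bootstrap sample per tree, recursively partition the label set to form the hierarchy, and train a local multi-label classifier at each node, then combine the trees as an ensemble. The following is a minimal sketch of that pipeline under stated assumptions: the random label split, the `CentroidBinary` local classifier, and all parameter values are illustrative stand-ins, not the paper's actual components (HOMER clusters labels by similarity, and the paper does not prescribe this local model).

```python
import numpy as np

class CentroidBinary:
    """Toy binary classifier (nearest positive/negative centroid).
    A stand-in for the local multi-label classifiers trained at each node."""
    def fit(self, X, y):
        self.const = None
        if len(y) == 0 or y.all() or not y.any():
            self.const = int(y[0]) if len(y) else 0   # degenerate: one class only
            return self
        self.pos = X[y == 1].mean(axis=0)
        self.neg = X[y == 0].mean(axis=0)
        return self

    def predict(self, X):
        if self.const is not None:
            return np.full(len(X), self.const)
        d_pos = np.linalg.norm(X - self.pos, axis=1)
        d_neg = np.linalg.norm(X - self.neg, axis=1)
        return (d_pos < d_neg).astype(int)

def build_homer(X, Y, label_idx, k, rng):
    """Recursively split the label set into k random groups (a stand-in for
    HOMER's label clustering) and fit one local classifier per meta-label."""
    if len(label_idx) == 1:
        return ("leaf", label_idx[0], CentroidBinary().fit(X, Y[:, label_idx[0]]))
    groups = np.array_split(rng.permutation(label_idx), min(k, len(label_idx)))
    children = []
    for g in groups:
        meta = Y[:, g].any(axis=1).astype(int)   # meta-label: any label of the group is active
        clf = CentroidBinary().fit(X, meta)
        sub = meta == 1                          # recurse only on positive examples
        if not sub.any():
            sub[:] = True
        children.append((clf, build_homer(X[sub], Y[sub], g, k, rng)))
    return ("node", children)

def predict_tree(tree, X, idx, out):
    """Descend into a child only where its meta-label classifier fires."""
    if tree[0] == "leaf":
        _, j, clf = tree
        out[idx, j] = clf.predict(X)
    else:
        for clf, child in tree[1]:
            mask = clf.predict(X) == 1
            if mask.any():
                predict_tree(child, X[mask], idx[mask], out)

def erf_h_fit(X, Y, n_trees=5, k=2, seed=0):
    """Grow each HOMER tree on a bootstrap sample of the data."""
    rng = np.random.default_rng(seed)
    n = len(X)
    trees = []
    for _ in range(n_trees):
        boot = rng.integers(0, n, n)             # sampling with replacement
        trees.append(build_homer(X[boot], Y[boot], np.arange(Y.shape[1]), k, rng))
    return trees

def erf_h_predict(trees, X, n_labels):
    """Combine the trees by a per-label majority vote."""
    votes = np.zeros((len(X), n_labels))
    for tree in trees:
        out = np.zeros((len(X), n_labels), dtype=int)
        predict_tree(tree, X, np.arange(len(X)), out)
        votes += out
    return (votes * 2 >= len(trees)).astype(int)

# Synthetic demo: two feature clusters, each tied to two labels.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
Y = np.zeros((60, 4), dtype=int)
Y[:30, :2] = 1                                   # cluster A carries labels 0 and 1
Y[30:, 2:] = 1                                   # cluster B carries labels 2 and 3
trees = erf_h_fit(X, Y, n_trees=5, k=2)
pred = erf_h_predict(trees, X, n_labels=4)
acc = (pred == Y).mean()
print(f"label-wise accuracy: {acc:.2f}")
```

The majority vote at the end is one simple way to aggregate the forest; the paper's own combination rule may differ.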
Pages: 450-460
Page count: 11