Multi-Label Learning with Distribution Matching Ensemble: An Adaptive and Just-In-Time Weighted Ensemble Learning Algorithm for Classifying a Nonstationary Online Multi-Label Data Stream

Cited: 1
Authors
Shen, Chao [1 ]
Liu, Bingyu [1 ]
Shao, Changbin [1 ]
Yang, Xibei [1 ]
Xu, Sen [2 ]
Zhu, Changming [3 ]
Yu, Hualong [1 ]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Comp, Zhenjiang 212100, Peoples R China
[2] Yancheng Inst Technol, Sch Informat Technol, Yancheng 224051, Peoples R China
[3] Minzu Univ China, Key Lab Ethn language Intelligent Anal & Secur Gov, Beijing 100081, Peoples R China
Source
SYMMETRY-BASEL | 2025, Vol. 17, Issue 02
Funding
National Natural Science Foundation of China
Keywords
multi-label data stream; adaptive weighted ensemble; concept drift; distribution matching; Gaussian mixture model; Kullback-Leibler divergence; label distribution drift detection; CONCEPT DRIFT; CLASSIFICATION; MACHINE;
DOI
10.3390/sym17020182
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline classification codes
07; 0710; 09
Abstract
Learning from a nonstationary data stream is challenging: the stream is generally considered endless, and the learning model must be continually amended to adapt to shifting data distributions. With multi-label data, the challenge is further intensified. In this study, an adaptive online weighted multi-label ensemble learning algorithm called MLDME (multi-label learning with distribution matching ensemble) is proposed. It simultaneously calculates the feature-matching level and the label-matching level between each reserved data block and the newly received data block, and then adaptively assigns decision weights to the ensemble classifiers according to these distribution similarities. Specifically, MLDME abandons the commonly used but not always correct underlying hypothesis that, in a data stream, each data block has the distribution most similar to the block emerging immediately after it; MLDME can therefore provide a just-in-time decision for the newly received data block. In addition, to avoid unbounded growth of the ensemble, we store the classifiers in a fixed-size buffer and design three dynamic classifier-updating rules. Experimental results on nine synthetic and three real-world nonstationary multi-label data streams indicate that the proposed MLDME algorithm is superior to several popular and state-of-the-art online learning paradigms and algorithms, including two designed specifically for classifying nonstationary multi-label data streams.
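The distribution-matching weighting idea in the abstract can be sketched as follows. This is a simplified illustration, not the paper's actual procedure: it models each data block with a single diagonal Gaussian over the features (the paper uses a Gaussian mixture model and additionally matches label distributions), and all function names here are hypothetical.

```python
import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """KL divergence KL(N(mu_p, var_p) || N(mu_q, var_q)) for diagonal Gaussians."""
    var_p = np.asarray(var_p, dtype=float)
    var_q = np.asarray(var_q, dtype=float)
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

def matching_weights(reserved_blocks, new_block, eps=1e-9):
    """Weight each reserved block's classifier by how closely the block's
    feature distribution matches that of the newly received block.
    Lower KL divergence -> higher weight; weights sum to 1."""
    mu_new = new_block.mean(axis=0)
    var_new = new_block.var(axis=0) + eps
    divergences = []
    for block in reserved_blocks:
        mu = block.mean(axis=0)
        var = block.var(axis=0) + eps
        divergences.append(gaussian_kl(mu, var, mu_new, var_new))
    d = np.array(divergences)
    w = np.exp(-d)  # smaller divergence -> larger decision weight
    return w / w.sum()
```

Under this sketch, a reserved block drawn from the same distribution as the new block receives nearly all of the decision weight, while blocks from drifted distributions are almost ignored; this is how "just-in-time" weighting avoids the assumption that the most recent block is always the best match.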
Pages: 25
Related Papers
(50 records total)
  • [41] An Ensemble Deep Learning Architecture for Multi-label Classification on TI-RADS
    Duan, Xueli
    Duan, Shaobo
    Jiang, Pei
    Li, Runzhi
    Zhang, Ye
    Ma, Jingzhe
    Zhao, Hongling
    Dai, Honghua
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 576 - 582
  • [42] Adaptive Model over a Multi-label Streaming Data: Experimental Study over Stream Multi-label Classification
    ALattas, Amani M.
    2018 21ST SAUDI COMPUTER SOCIETY NATIONAL COMPUTER CONFERENCE (NCC), 2018,
  • [43] Active Learning Algorithms for Multi-label Data
    Cherman, Everton Alvares
    Tsoumakas, Grigorios
    Monard, Maria-Carolina
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2016, 2016, 475 : 267 - 279
  • [44] Imbalance multi-label data learning with label specific features
    Rastogi, Reshma
    Mortaza, Sayed
    NEUROCOMPUTING, 2022, 513 : 395 - 408
  • [46] Towards the Learning of Weighted Multi-label Associative Classifiers
    Liu, Chunyang
    Chen, Ling
    Tsang, Ivor
    Yin, Hongzhi
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [47] Matching Neural Network for Extreme Multi-Label Learning
    Zhao, Zhiyun
    Li, Fengzhi
    Zuo, Yuan
    Wu, Junjie
    4TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE APPLICATIONS AND TECHNOLOGIES (AIAAT 2020), 2020, 1642
  • [48] Multi-label learning with label-specific features via weighting and label entropy guided clustering ensemble
    Zhang, Chunyu
    Li, Zhanshan
    NEUROCOMPUTING, 2021, 419: 59 - 69
  • [49] An Imbalanced Multi-Label Data Ensemble Learning Method Based on Safe Under-Sampling
    Sun, Zhong-Bin
    Diao, Yu-Xuan
    Ma, Su-Yang
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2024, 52 (10): : 3392 - 3408
  • [50] Multi-modal multi-label semantic indexing of images based on hybrid ensemble learning
    Li, Wei
    Sun, Maosong
    Habel, Christopher
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2007, 2007, 4810 : 744+