Multi-armed bandit based online model selection for concept-drift adaptation

Cited by: 0
Authors
Wilson, Jobin [1 ,2 ]
Chaudhury, Santanu [2 ,3 ]
Lall, Brejesh [2 ]
Affiliations
[1] Flytxt, R&D Dept, Trivandrum, Kerala, India
[2] Indian Inst Technol Delhi, Dept Elect Engn, New Delhi, India
[3] Indian Inst Technol Jodhpur, Dept Comp Sci & Engn, Jodhpur, India
Keywords
concept-drift; ensemble methods; model selection; multi-armed bandits; CLASSIFICATION; FRAMEWORK;
DOI
10.1111/exsy.13626
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Ensemble methods are among the most effective concept-drift adaptation techniques due to their high learning performance and flexibility. However, they are computationally expensive and pose a challenge in applications involving high-speed data streams. In this paper, we present a computationally efficient heterogeneous classifier ensemble named OMS-MAB, which uses online model selection for concept-drift adaptation by posing it as a non-stationary multi-armed bandit (MAB) problem. We use a MAB to select a single adaptive learner within the ensemble for learning and prediction while systematically exploring promising alternatives. Each ensemble member is made drift resistant using explicit drift detection and is represented as an arm of the MAB. An exploration factor ε controls the trade-off between predictive performance and computational resource requirements, eliminating the need to continuously train and evaluate all the ensemble members. A rigorous evaluation on 20 benchmark datasets and 9 algorithms indicates that the accuracy of OMS-MAB is statistically on par with state-of-the-art (SOTA) ensembles. Moreover, it offers a significant reduction in execution time and model size in comparison to several SOTA ensemble methods, making it a promising ensemble for resource-constrained stream-mining problems.
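The abstract's core idea, treating each base learner as an arm of a non-stationary bandit and using an exploration factor ε to decide when to try an alternative, can be illustrated with a minimal ε-greedy sketch. This is not the authors' OMS-MAB implementation; the class name, the discount-based reward tracking, and the 0/1 accuracy reward are all illustrative assumptions.

```python
import random

class EpsilonGreedyModelSelector:
    """Minimal epsilon-greedy sketch of MAB-based online model selection.

    Each arm stands for one base learner in the ensemble; rewards are
    0/1 prediction accuracies, tracked with a discount factor so the
    bandit can adapt as the best model changes under concept drift.
    (Illustrative only; not the OMS-MAB algorithm from the paper.)
    """

    def __init__(self, models, epsilon=0.1, discount=0.99):
        self.models = models                  # arm index -> base learner
        self.epsilon = epsilon                # exploration probability
        self.discount = discount              # forgetting factor (non-stationarity)
        self.reward = [0.0] * len(models)     # discounted reward sums per arm
        self.weight = [0.0] * len(models)     # discounted pull counts per arm

    def select(self):
        """Pick the arm to train/predict with on the next instance."""
        # With probability epsilon, explore a random alternative model;
        # otherwise exploit the arm with the best discounted mean reward.
        if random.random() < self.epsilon:
            return random.randrange(len(self.models))
        return max(
            range(len(self.models)),
            key=lambda a: (self.reward[a] / self.weight[a]
                           if self.weight[a] > 0 else float("inf")),
        )

    def update(self, arm, correct):
        """Credit the pulled arm after observing the true label."""
        # Discount every arm first, so old evidence fades over time,
        # then add a 0/1 reward for the arm that was actually used.
        for a in range(len(self.models)):
            self.reward[a] *= self.discount
            self.weight[a] *= self.discount
        self.reward[arm] += float(correct)
        self.weight[arm] += 1.0
```

Only the selected arm is trained and evaluated on each instance, which is what yields the execution-time and model-size savings the abstract describes relative to ensembles that update every member.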
Pages: 25