Task-Free Continual Learning via Online Discrepancy Distance Learning

Cited by: 0
|
Authors
Ye, Fei [1 ]
Bors, Adrian G. [1 ]
Affiliations
[1] Univ York, Dept Comp Sci, York YO10 5GH, England
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Learning from non-stationary data streams, also called Task-Free Continual Learning (TFCL), remains challenging due to the absence of explicit task information in most applications. Although several algorithms have recently been proposed for TFCL, these methods lack theoretical guarantees. Moreover, there are no theoretical studies of forgetting during TFCL. This paper develops a new theoretical analysis framework that derives generalization bounds based on the discrepancy distance between the visited samples and the entire information made available for training the model. This analysis provides new insights into the forgetting behaviour in classification tasks. Inspired by this theoretical model, we propose Online Discrepancy Distance Learning (ODDL), a new approach equipped with a dynamic component expansion mechanism for a mixture model. ODDL estimates the discrepancy between the current memory and the already accumulated knowledge and uses it as an expansion signal, aiming to ensure a compact network architecture with optimal performance. We then propose a new sample selection approach that selectively stores samples in the memory buffer according to a discrepancy-based measure, further improving performance. We perform several TFCL experiments with the proposed methodology, demonstrating that it achieves state-of-the-art performance.
Pages: 14
Related Papers
50 records in total
  • [1] Task-Free Continual Learning
    Aljundi, Rahaf
    Kelchtermans, Klaas
    Tuytelaars, Tinne
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 11246 - 11255
  • [2] Online industrial fault prognosis in dynamic environments via task-free continual learning
    Liu, Chongdang
    Zhang, Linxuan
    Zheng, Yimeng
    Jiang, Zhengyi
    Zheng, Jinghao
    Wu, Cheng
    NEUROCOMPUTING, 2024, 598
  • [3] Online Task-free Continual Learning with Dynamic Sparse Distributed Memory
    Pourcel, Julien
Vu, Ngoc-Son
    French, Robert M.
    COMPUTER VISION, ECCV 2022, PT XXV, 2022, 13685 : 739 - 756
  • [4] LEARNING AN EVOLVED MIXTURE MODEL FOR TASK-FREE CONTINUAL LEARNING
    Ye, Fei
    Bors, Adrian G.
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 1936 - 1940
  • [5] Gradient-based Editing of Memory Examples for Online Task-free Continual Learning
    Jin, Xisen
    Sadhu, Arka
    Du, Junyi
    Ren, Xiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [6] Task-Free Continual Generation and Representation Learning via Dynamic Expansionable Memory Cluster
    Ye, Fei
    Bors, Adrian G.
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 16451 - 16459
  • [7] Task-Free Dynamic Sparse Vision Transformer for Continual Learning
    Ye, Fei
    Bors, Adrian G.
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 16442 - 16450
  • [8] Task-aware network: Mitigation of task-aware and task-free performance gap in online continual learning
    Hong, Yongwon
    Park, Sungho
    Byun, Hyeran
    NEUROCOMPUTING, 2023, 552
  • [9] Improving Task-free Continual Learning by Distributionally Robust Memory Evolution
    Wang, Zhenyi
    Shen, Li
    Fang, Le
    Suo, Qiuling
    Duan, Tiehang
    Gao, Mingchen
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [10] Rethinking Trajectory Prediction in Real-World Applications: An Online Task-Free Continual Learning Perspective
    Lin, Yunlong
    Li, Zirui
    Gong, Cheng
    Liu, Qi
    Lu, Chao
    Gong, Jianwei
    2023 IEEE 26TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, ITSC, 2023, : 5020 - 5026