Beyond Not-Forgetting: Continual Learning with Backward Knowledge Transfer

Cited by: 0
Authors
Lin, Sen [1 ]
Yang, Li [1 ]
Fan, Deliang [1 ]
Zhang, Junshan [2 ]
Affiliations
[1] Arizona State Univ, Sch ECEE, Tempe, AZ 85287 USA
[2] Univ Calif Davis, Dept ECE, Davis, CA 95616 USA
Funding
US National Science Foundation;
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
By learning a sequence of tasks continually, an agent in continual learning (CL) can improve its performance on a new task through forward knowledge transfer and on 'old' tasks through backward knowledge transfer. However, most existing CL methods focus on addressing catastrophic forgetting in neural networks by minimizing the modification of the learnt model for old tasks. This inevitably limits backward knowledge transfer from the new task to the old tasks, because judicious model updates could improve the performance of the old tasks as well. To tackle this problem, we first analyze theoretically, based on gradient projection onto the input subspaces of old tasks, the conditions under which updating the learnt model of old tasks is beneficial for CL and leads to backward knowledge transfer. Building on this analysis, we develop a ContinUal learning method with Backward knowlEdge tRansfer (CUBER) for a fixed-capacity neural network without data replay. In particular, CUBER first characterizes task correlations to identify the positively correlated old tasks in a layer-wise manner, and then selectively modifies the learnt model of those old tasks when learning the new task. Experimental studies show that CUBER achieves, for the first time without data replay, positive backward knowledge transfer on several existing CL benchmarks on which related baselines still suffer from catastrophic forgetting (negative backward knowledge transfer). CUBER's superior backward knowledge transfer also translates into higher accuracy.
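As a rough illustration of the mechanism the abstract describes (gradient projection onto the input subspace of an old task, plus a layer-wise correlation test that decides whether the projected component is kept, in the hope of backward transfer, or removed, GPM-style, to avoid forgetting), here is a minimal NumPy sketch. It is not the authors' implementation: the function names, the cosine-similarity correlation proxy, and the `corr_threshold` parameter are hypothetical choices made only for illustration.

```python
# Minimal sketch of a CUBER-like update rule (illustration only, not the paper's code).
import numpy as np

def project_onto_subspace(grad, basis):
    """Project a flattened layer gradient onto span(basis), basis: (d, k) orthonormal columns."""
    return basis @ (basis.T @ grad)

def cuber_style_update(layer_grad_new, layer_grad_old, input_basis, corr_threshold=0.0):
    """Return a layer update direction under a sketched CUBER-like rule."""
    g_proj = project_onto_subspace(layer_grad_new, input_basis)
    g_old_proj = project_onto_subspace(layer_grad_old, input_basis)
    # Layer-wise correlation proxy (assumption): cosine similarity between the
    # new-task and old-task gradients inside the old task's input subspace.
    denom = np.linalg.norm(g_proj) * np.linalg.norm(g_old_proj) + 1e-12
    corr = float(g_proj @ g_old_proj) / denom
    if corr > corr_threshold:
        # Positively correlated old task: keep the in-subspace component,
        # allowing the old task's model to be modified (potential backward transfer).
        return layer_grad_new
    # Otherwise remove the in-subspace component (orthogonal projection),
    # i.e., fall back to the usual forgetting-avoidance behaviour.
    return layer_grad_new - g_proj

# Toy usage with random data.
d, k = 16, 4
basis, _ = np.linalg.qr(np.random.randn(d, k))   # orthonormal basis of an old task's input subspace
g_new, g_old = np.random.randn(d), np.random.randn(d)
update = cuber_style_update(g_new, g_old, basis)
```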
Pages: 13
Related Papers
50 records in total
  • [1] Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
    Ke, Zixuan
    Liu, Bing
    Ma, Nianzu
    Xu, Hu
    Shu, Lei
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [2] Partially Relaxed Masks for Knowledge Transfer Without Forgetting in Continual Learning
    Konishi, Tatsuya
    Kurokawa, Mori
    Ono, Chihiro
    Ke, Zixuan
    Kim, Gyuhak
    Liu, Bing
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT I, 2022, 13280 : 367 - 379
  • [3] Disentangled Representations for Continual Learning: Overcoming Forgetting and Facilitating Knowledge Transfer
    Xu, Zhaopeng
    Qin, Qi
    Liu, Bing
    Zhao, Dongyan
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, PT IV, ECML PKDD 2024, 2024, 14944 : 143 - 159
  • [4] Quantum continual learning of quantum data realizing knowledge backward transfer
    Situ, Haozhen
    Lu, Tianxiang
    Pan, Minghua
    Li, Lvzhou
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2023, 620
  • [5] A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning
    Wang, Zhenyi
    Yang, Enneng
    Shen, Li
    Huang, Heng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (03) : 1464 - 1483
  • [6] AFEC: Active Forgetting of Negative Transfer in Continual Learning
    Wang, Liyuan
    Zhang, Mingtian
    Jia, Zhongfan
    Li, Qian
    Ma, Kaisheng
    Bao, Chenglong
    Zhu, Jun
    Zhong, Yi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] A THEORY FOR KNOWLEDGE TRANSFER IN CONTINUAL LEARNING
    Benavides-Prado, Diana
    Riddle, Patricia
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
  • [8] Continual Learning with Knowledge Transfer for Sentiment Classification
    Ke, Zixuan
    Liu, Bing
    Wang, Hao
    Shu, Lei
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT III, 2021, 12459 : 683 - 698
  • [9] Time scales of knowledge transfer with learning and forgetting
    Lin, Min
    Zhang, Qun
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2019, 525 : 704 - 713
  • [10] Example forgetting and rehearsal in continual learning
    Benko, Beatrix
    PATTERN RECOGNITION LETTERS, 2024, 179 : 65 - 72