CONSISTENCY IS THE KEY TO FURTHER MITIGATING CATASTROPHIC FORGETTING IN CONTINUAL LEARNING

Cited by: 0
Authors
Bhat, Prashant [1 ]
Zonooz, Bahram [1 ]
Arani, Elahe [1 ]
Affiliations
[1] NavInfo Europe, Adv Res Lab, Eindhoven, Netherlands
Keywords
MODELS;
DOI
Not available
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Deep neural networks struggle to continually learn multiple sequential tasks due to catastrophic forgetting of previously learned tasks. Rehearsal-based methods, which explicitly store previous task samples in a buffer and interleave them with current task samples, have proven to be the most effective in mitigating forgetting. However, Experience Replay (ER) does not perform well under low-buffer regimes and longer task sequences, as its performance is commensurate with the buffer size. Consistency in predictions of soft targets can help ER better preserve information pertaining to previous tasks, since soft targets capture the rich similarity structure of the data. Therefore, we examine the role of consistency regularization in the ER framework under various continual learning scenarios. We also propose to cast consistency regularization as a self-supervised pretext task, thereby enabling the use of a wide variety of self-supervised learning methods as regularizers. While simultaneously enhancing model calibration and robustness to natural corruptions, regularizing consistency in predictions results in less forgetting across all continual learning scenarios. Among the different families of regularizers, we find that stricter consistency constraints preserve previous task information in ER better.
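The abstract describes adding a consistency-in-predictions penalty on top of standard Experience Replay. Below is a minimal sketch of what one such training step might look like; the buffer interface (sampling inputs, labels, and stored soft targets), the MSE form of the consistency penalty, and the weighting term alpha are illustrative assumptions based on the abstract, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def er_consistency_step(model, optimizer, cur_x, cur_y, buffer, alpha=0.5):
    """One ER update with an added consistency term on buffered samples.

    `buffer.sample()` is assumed to return past inputs, their labels, and the
    soft targets (logits) recorded when they were stored -- a hypothetical
    interface used here only for illustration.
    """
    optimizer.zero_grad()

    # Standard supervised loss on the current task batch.
    loss = F.cross_entropy(model(cur_x), cur_y)

    if len(buffer) > 0:
        buf_x, buf_y, buf_logits = buffer.sample(batch_size=cur_x.size(0))
        buf_out = model(buf_x)

        # Plain ER: interleave buffered samples with the current batch.
        loss = loss + F.cross_entropy(buf_out, buf_y)

        # Consistency regularization: keep current predictions on buffered
        # samples close to their stored soft targets (an MSE penalty here;
        # stricter constraints on the full predictive distribution are the
        # kind of variant the abstract alludes to).
        loss = loss + alpha * F.mse_loss(buf_out, buf_logits)

    loss.backward()
    optimizer.step()
    return loss.item()
```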
Pages: 18