A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning

Cited by: 0
Authors
Wang, Zhenyi [1 ]
Yang, Enneng [2 ]
Shen, Li [3 ]
Huang, Heng [1 ]
Affiliations
[1] Univ Maryland, Dept Comp Sci, College Pk, MD 20742 USA
[2] Northeastern Univ, Shenyang 110819, Liaoning, Peoples R China
[3] Sun Yat Sen Univ, Guangzhou, Peoples R China
Keywords
Beneficial forgetting; harmful forgetting; memorization; distribution shift; cross-disciplinary research
DOI
10.1109/TPAMI.2024.3498346
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Forgetting refers to the loss or deterioration of previously acquired knowledge. While existing surveys on forgetting have primarily focused on continual learning, forgetting is a prevalent phenomenon in many other research domains within deep learning: it manifests in generative models due to generator shifts, and in federated learning due to heterogeneous data distributions across clients. Addressing forgetting involves several challenges, including balancing the retention of old-task knowledge with fast learning of new tasks, managing task interference under conflicting goals, and preventing privacy leakage. Moreover, most existing surveys on continual learning implicitly assume that forgetting is always harmful. In contrast, our survey argues that forgetting is a double-edged sword and can be beneficial and even desirable in certain cases, such as privacy-preserving scenarios. By exploring forgetting in a broader context, we present a more nuanced understanding of this phenomenon and highlight its potential advantages. Through this comprehensive survey, we aspire to uncover potential solutions by drawing on ideas and approaches from various fields that have dealt with forgetting. By examining forgetting beyond its conventional boundaries, we hope to encourage the development of novel strategies for mitigating, harnessing, or even embracing forgetting in real applications.
Pages: 1464 - 1483 (20 pages)
Related Papers (50 in total)
  • [1] A Continual Learning Survey: Defying Forgetting in Classification Tasks
    De Lange, Matthias
    Aljundi, Rahaf
    Masana, Marc
    Parisot, Sarah
    Jia, Xu
    Leonardis, Ales
    Slabaugh, Greg
    Tuytelaars, Tinne
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (07) : 3366 - 3385
  • [2] Beyond Not-Forgetting: Continual Learning with Backward Knowledge Transfer
    Lin, Sen
    Yang, Li
    Fan, Deliang
    Zhang, Junshan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [3] Catastrophic Forgetting in Deep Learning: A Comprehensive Taxonomy
    Aleixo, Everton Lima
    Colonna, Juan G.
    Cristo, Marco
    Fernandes, Everlandio
    JOURNAL OF THE BRAZILIAN COMPUTER SOCIETY, 2024, 30 (01) : 175 - 211
  • [4] Continual Deep Reinforcement Learning to Prevent Catastrophic Forgetting in Jamming Mitigation
    Davaslioglu, Kemal
    Kompella, Sastry
    Erpek, Tugba
    Sagduyu, Yalin E.
    arXiv,
  • [5] Selective Amnesia: A Continual Learning Approach to Forgetting in Deep Generative Models
    Heng, Alvin
    Soh, Harold
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] Example forgetting and rehearsal in continual learning
    Benko, Beatrix
    PATTERN RECOGNITION LETTERS, 2024, 179 : 65 - 72
  • [7] Continual Deep Reinforcement Learning to Prevent Catastrophic Forgetting in Jamming Mitigation
    Davaslioglu, Kemal
    Kompella, Sastry
    Erpek, Tugba
    Sagduyu, Yalin E.
    Proc IEEE Mil Commun Conf MILCOM, 2024 : 740 - 745
  • [8] A Comprehensive Survey of Continual Learning: Theory, Method and Application
    Wang, Liyuan
    Zhang, Xingxing
    Su, Hang
    Zhu, Jun
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (08) : 5362 - 5383
  • [9] Quantum Continual Learning Overcoming Catastrophic Forgetting
    Jiang, Wenjie
    Lu, Zhide
    Deng, Dong-Ling
    CHINESE PHYSICS LETTERS, 2022, 39 (05)