Memory efficient data-free distillation for continual learning

Cited by: 4
Authors
Li, Xiaorong [1 ]
Wang, Shipeng [1 ]
Sun, Jian [1 ]
Xu, Zongben [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian, Shaanxi, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
Continual learning; Catastrophic forgetting; Knowledge distillation;
DOI
10.1016/j.patcog.2023.109875
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep neural networks suffer from catastrophic forgetting when trained on sequential tasks in continual learning, especially when data from previous tasks are unavailable. To mitigate this, existing methods either store data from previous tasks, which may raise privacy concerns, or require large memory storage. In particular, distillation-based methods alleviate forgetting by using proxy datasets, but a proxy dataset may not match the distribution of the original data of previous tasks. To address these problems in a setting where the full training data of previous tasks are unavailable and memory resources are limited, we propose a novel data-free distillation method. Our method encodes the knowledge of previous tasks into the gradients of the network parameters via a Taylor expansion, which yields a gradient-based regularizer in the training loss. To improve memory efficiency, we design an approach to compress the gradients stored for the regularizer. Moreover, we theoretically analyze the approximation error of our method. Experimental results on multiple datasets demonstrate that the proposed method outperforms existing continual learning approaches.
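To make the idea concrete, the following is a minimal PyTorch sketch of a gradient-based regularizer of this kind, assuming a first-order Taylor expansion of the previous task's loss around the old parameters and top-k sparsification as the gradient-compression step. It illustrates the general technique, not the authors' implementation; all names (top_k_compress, GradientRegularizer, lam, ratio) are hypothetical.

    # Sketch only: gradient-based regularizer for continual learning.
    # Assumed approximation: L_prev(theta) ~= L_prev(theta_old) + g_prev^T (theta - theta_old),
    # so a stored (compressed) gradient g_prev defines a penalty on parameter drift.
    import torch

    def top_k_compress(grad, ratio=0.1):
        # Keep only the largest-magnitude entries of the gradient
        # (one possible compression scheme; the paper's scheme may differ).
        flat = grad.flatten()
        k = max(1, int(ratio * flat.numel()))
        idx = flat.abs().topk(k).indices
        return idx, flat[idx]  # store indices + values instead of the dense gradient

    class GradientRegularizer:
        def __init__(self, model, prev_task_loss, ratio=0.1, lam=1.0):
            self.lam = lam
            # Gradients of the previous task's loss at the old parameters.
            grads = torch.autograd.grad(prev_task_loss, list(model.parameters()))
            self.old_params = [p.detach().clone() for p in model.parameters()]
            self.compressed = [top_k_compress(g.detach(), ratio) for g in grads]

        def penalty(self, model):
            # Penalize the (absolute) first-order change of the previous task's loss;
            # the absolute value is one design choice among several.
            total = 0.0
            for p, p_old, (idx, vals) in zip(model.parameters(), self.old_params, self.compressed):
                delta = (p - p_old).flatten()[idx]
                total = total + (vals * delta).sum().abs()
            return self.lam * total

    # Illustrative usage when training on a new task:
    #   loss = criterion(model(x), y) + reg.penalty(model)

In such a scheme only the old parameters and the compressed gradients are stored, which is what makes the approach data-free and memory efficient compared with replaying raw samples from previous tasks.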
Pages: 9
Related Papers
50 records in total
  • [41] Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
    Do, Kien; Le, Hung; Dung Nguyen; Dang Nguyen; Harikumar, Haripriya; Truyen Tran; Rana, Santu; Venkatesh, Svetha
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [42] Customizing Synthetic Data for Data-Free Student Learning
    Luo, Shiya; Chen, Defang; Wang, Can
    2023 IEEE International Conference on Multimedia and Expo (ICME), 2023: 1817-1822
  • [43] Class Impression for Data-Free Incremental Learning
    Ayromlou, Sana; Abolmaesumi, Purang; Tsang, Teresa; Li, Xiaoxiao
    Medical Image Computing and Computer Assisted Intervention (MICCAI 2022), Pt IV, 2022, 13434: 320-329
  • [44] D3K: Dynastic Data-Free Knowledge Distillation
    Li, Xiufang; Sun, Qigong; Jiao, Licheng; Liu, Fang; Liu, Xu; Li, Lingling; Chen, Puhua; Zuo, Yi
    IEEE Transactions on Multimedia, 2023, 25: 8358-8371
  • [45] Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation
    Qu, Xiaoyang; Wang, Jianzong; Xiao, Jing
    2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 3340-3344
  • [46] Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis
    Wang, Zi
    Thirty-Fifth AAAI Conference on Artificial Intelligence / Thirty-Third Conference on Innovative Applications of Artificial Intelligence / Eleventh Symposium on Educational Advances in Artificial Intelligence, 2021, 35: 10245-10253
  • [47] Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation
    Patel, Gaurav; Mopuri, Konda Reddy; Qiu, Qiang
    2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 7786-7794
  • [48] Conditional pseudo-supervised contrast for data-free knowledge distillation
    Shao, Renrong; Zhang, Wei; Wang, Jun
    Pattern Recognition, 2023, 143
  • [49] Adversarial Self-Supervised Data-Free Distillation for Text Classification
    Ma, Xinyin; Shen, Yongliang; Fang, Gongfan; Chen, Chen; Jia, Chenghao; Lu, Weiming
    Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 6182-6192
  • [50] Data-free Knowledge Distillation for Fine-grained Visual Categorization
    Shao, Renrong; Zhang, Wei; Yin, Jianhua; Wang, Jun
    2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023: 1515-1525