Gradient Regularization with Multivariate Distribution of Previous Knowledge for Continual Learning

Cited by: 3
Authors
Kim, Tae-Heon [1 ]
Moon, Hyung-Jun [2 ]
Cho, Sung-Bae [1 ,2 ]
Affiliations
[1] Yonsei Univ, Dept Comp Sci, Seoul 03722, South Korea
[2] Yonsei Univ, Dept Artificial Intelligence, Seoul 03722, South Korea
Keywords
Continual learning; Memory replay; Sample generation; Multivariate Gaussian distribution; Expectation-maximization
DOI
10.1007/978-3-031-21753-1_35
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Continual learning is a learning setup for environments where data arrive sequentially and a model continually learns new tasks. However, the model forgets previously learned knowledge as it learns new classes. One approach keeps a few previous samples, but this causes other problems such as overfitting and class imbalance. In this paper, we propose a method that retrains a network with representations generated from an estimated multivariate Gaussian distribution. The representations are vectors produced by a CNN trained with gradient regularization to prevent distribution shift, allowing the stored means and covariances to generate realistic representations. The generated vectors cover every class seen so far, which helps prevent forgetting. Our 6-fold cross-validation experiments show that the proposed method outperforms existing continual learning methods by 1.14%p on CIFAR10 and 4.60%p on CIFAR100, respectively. Moreover, we visualize the generated vectors with t-SNE to confirm the validity of a multivariate Gaussian mixture for estimating the distribution of the data representations.
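The replay idea in the abstract, generating pseudo-representations for all previously seen classes from stored per-class means and covariances, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation; the helper names (`fit_class_gaussians`, `generate_replay`) are hypothetical, and the paper additionally fits a Gaussian mixture via expectation-maximization and applies gradient regularization during CNN training, neither of which is shown here.

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Estimate one multivariate Gaussian (mean, covariance) per class
    from CNN feature vectors. Hypothetical helper for illustration."""
    stats = {}
    for c in np.unique(labels):
        x = features[labels == c]
        stats[int(c)] = (x.mean(axis=0), np.cov(x, rowvar=False))
    return stats

def generate_replay(stats, n_per_class, seed=None):
    """Sample pseudo-representations covering every class seen so far,
    to be replayed alongside new-task data when retraining the head."""
    rng = np.random.default_rng(seed)
    xs, ys = [], []
    for c, (mu, cov) in stats.items():
        xs.append(rng.multivariate_normal(mu, cov, size=n_per_class))
        ys.append(np.full(n_per_class, c))
    return np.concatenate(xs), np.concatenate(ys)
```

For example, after fitting `stats` on the features of tasks seen so far, `generate_replay(stats, 100)` yields 100 feature vectors per old class that can be mixed into the next task's training batches.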
Pages: 359-368
Page count: 10
Related Papers
50 records in total
  • [21] Continual Learning Based on Knowledge Distillation and Representation Learning
    Chen, Xiu-Yan
    Liu, Jian-Wei
    Li, Wen-Tao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 27 - 38
  • [22] Learning where to learn: Gradient sparsity in meta and continual learning
    von Oswald, Johannes
    Zhao, Dominic
    Kobayashi, Seijin
    Schug, Simon
    Caccia, Massimo
    Zucchet, Nicolas
    Sacramento, Joao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [23] Continual Learning by Contrastive Learning of Regularized Classes in Multivariate Gaussian Distributions
    Moon, Hyung-Jun
    Cho, Sung-Bae
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2025,
  • [24] Contrastive Learning of Multivariate Gaussian Distributions of Incremental Classes for Continual Learning
    Moon, Hyung-Jun
    Cho, Sung-Bae
    ARTIFICIAL INTELLIGENCE FOR NEUROSCIENCE AND EMOTIONAL SYSTEMS, PT I, IWINAC 2024, 2024, 14674 : 518 - 527
  • [25] Measuring Asymmetric Gradient Discrepancy in Parallel Continual Learning
    Lyu, Fan
    Sun, Qing
    Shang, Fanhua
    Wan, Liang
    Feng, Wei
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11377 - 11386
  • [26] Continual Learning with Knowledge Transfer for Sentiment Classification
    Ke, Zixuan
    Liu, Bing
    Wang, Hao
    Shu, Lei
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT III, 2021, 12459 : 683 - 698
  • [27] Gradient based sample selection for online continual learning
    Aljundi, Rahaf
    Lin, Min
    Goujaud, Baptiste
    Bengio, Yoshua
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [28] Gradient Regularized Contrastive Learning for Continual Domain Adaptation
    Tang, Shixiang
    Su, Peng
    Chen, Dapeng
    Ouyang, Wanli
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 2665 - 2673
  • [29] Auxiliary Local Variables for Improving Regularization/Prior Approach in Continual Learning
    Linh Ngo Van
    Nam Le Hai
    Hoang Pham
    Khoat Than
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT I, 2022, 13280 : 16 - 28
  • [30] CONTINUAL SELF-SUPERVISED LEARNING IN EARTH OBSERVATION WITH EMBEDDING REGULARIZATION
    Moieez, Hamna
    Marsocci, Valerio
    Scardapane, Simone
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 5029 - 5032