Dynamic Self-Supervised Teacher-Student Network Learning

Cited by: 11
Authors
Ye, Fei [1 ]
Bors, Adrian G. [1 ]
Affiliations
[1] Univ York, Dept Comp Sci, York YO10 5GH, England
Keywords
Task analysis; Mixture models; Training; Generative adversarial networks; Data models; Computational modeling; Self-supervised learning; Lifelong learning; representation learning; teacher-student framework; algorithm
DOI
10.1109/TPAMI.2022.3220928
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Lifelong learning (LLL) represents the ability of an artificial intelligence system to successively learn a sequence of different databases. In this paper we introduce the Dynamic Self-Supervised Teacher-Student Network (D-TS), representing a more general LLL framework, where the Teacher is implemented as a dynamically expanding mixture model which automatically increases its capacity to deal with a growing number of tasks. We propose the Knowledge Discrepancy Score (KDS) criterion for measuring the relevance of the incoming information characterizing a new task when compared to the existing knowledge accumulated by the Teacher module from its previous training. The KDS ensures a light Teacher architecture while also enabling the reuse of learned knowledge whenever appropriate, accelerating the learning of given tasks. The Student module is implemented as a lightweight probabilistic generative model. We introduce a novel self-supervised learning procedure for the Student that allows it to capture cross-domain latent representations from the entire knowledge accumulated by the Teacher as well as from novel data. We perform several experiments which show that D-TS can achieve state-of-the-art results in LLL while requiring fewer parameters than other methods.
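The abstract does not give the exact form of the KDS test, so the expansion behaviour it describes can only be illustrated with a minimal sketch. The code below is not the authors' implementation: the knowledge_discrepancy function, the DynamicTeacher class, and the expansion_threshold parameter are hypothetical stand-ins that merely mimic the stated idea of reusing an existing mixture component when a new task resembles accumulated knowledge and expanding the Teacher's capacity otherwise.

# Minimal sketch (assumed, not the paper's code) of a KDS-style expansion rule
# for a Teacher implemented as a growing mixture of per-task components.
import numpy as np

def knowledge_discrepancy(component_mean, task_data):
    """Toy discrepancy: distance between a component's stored statistics and
    the new task's data statistics (a stand-in for the paper's KDS)."""
    return float(np.linalg.norm(component_mean - task_data.mean(axis=0)))

class DynamicTeacher:
    """Mixture of simple per-task components; expands only when the new task
    is judged sufficiently novel by the (hypothetical) discrepancy test."""

    def __init__(self, expansion_threshold=1.0):
        self.components = []                      # each component is a mean vector here
        self.expansion_threshold = expansion_threshold

    def learn_task(self, task_data):
        # Score the new task against every existing component.
        if self.components:
            scores = [knowledge_discrepancy(c, task_data) for c in self.components]
            best = int(np.argmin(scores))
            if scores[best] < self.expansion_threshold:
                # Knowledge overlap: reuse and refine the closest component.
                self.components[best] = 0.5 * (self.components[best] + task_data.mean(axis=0))
                return f"reused component {best}"
        # Novel task: add a new component, so capacity grows with task dissimilarity.
        self.components.append(task_data.mean(axis=0))
        return f"added component {len(self.components) - 1}"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    teacher = DynamicTeacher(expansion_threshold=1.0)
    for shift in (0.0, 0.1, 5.0):                 # the third task is clearly novel
        data = rng.normal(loc=shift, size=(256, 8))
        print(teacher.learn_task(data))

In this toy setting the first two tasks share one component while the third, dissimilar task triggers expansion, reflecting how the KDS is said to keep the Teacher architecture light while still reusing prior knowledge.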
Pages: 5731 - 5748
Page count: 18
Related Papers
50 records
  • [32] Self-Supervised Dialogue Learning
    Wu, Jiawei
    Wang, Xin
    Wang, William Yang
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3857 - 3867
  • [33] Self-Supervised Learning for Anomaly Detection With Dynamic Local Augmentation
    Yoa, Seungdong
    Lee, Seungjun
    Kim, Chiyoon
    Kim, Hyunwoo J.
    IEEE ACCESS, 2021, 9 : 147201 - 147211
  • [34] Variational Dynamic for Self-Supervised Exploration in Deep Reinforcement Learning
    Bai, Chenjia
    Liu, Peng
    Liu, Kaiyu
    Wang, Lingxiao
    Zhao, Yingnan
    Han, Lei
    Wang, Zhaoran
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (08) : 4776 - 4790
  • [35] Locus of control and learning in the teacher-student binome
    Blank, E
    Melet, N
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 1996, 31 (3-4) : 54174 - 54174
  • [36] The teacher-student relationship as a mediating variable of learning
    Flores Moran, John Freddy
    REVISTA SAN GREGORIO, 2019, (35): : 189 - 201
  • [37] Self-supervised learning model
    Saga, Kazushie
    Sugasaka, Tamami
    Sekiguchi, Minoru
    Fujitsu Scientific and Technical Journal, 1993, 29 (03): : 209 - 216
  • [38] Longitudinal self-supervised learning
    Zhao, Qingyu
    Liu, Zixuan
    Adeli, Ehsan
    Pohl, Kilian M.
    MEDICAL IMAGE ANALYSIS, 2021, 71
  • [39] Credal Self-Supervised Learning
    Lienen, Julian
    Huellermeier, Eyke
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [40] Self-Supervised Learning for Recommendation
    Huang, Chao
    Xia, Lianghao
    Wang, Xiang
    He, Xiangnan
    Yin, Dawei
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 5136 - 5139