Dense Network Expansion for Class Incremental Learning

Cited by: 14
Authors
Hu, Zhiyuan [1 ]
Li, Yunsheng [2 ]
Lyu, Jiancheng [3 ]
Gao, Dashan [3 ]
Vasconcelos, Nuno [1 ]
Affiliations
[1] Univ Calif San Diego, San Diego, CA 92093 USA
[2] Microsoft Cloud AI, Redmond, WA USA
[3] Qualcomm AI Res, San Diego, CA USA
DOI
10.1109/CVPR52729.2023.01141
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The problem of class incremental learning (CIL) is considered. State-of-the-art approaches use a dynamic architecture based on network expansion (NE), in which a task expert is added per task. While effective from a computational standpoint, these methods lead to models that grow quickly with the number of tasks. A new NE method, dense network expansion (DNE), is proposed to achieve a better trade-off between accuracy and model complexity. This is accomplished by introducing dense connections between the intermediate layers of the task expert networks, which enable the transfer of knowledge from old to new tasks through feature sharing and reuse. The sharing is implemented with a cross-task attention mechanism, based on a new task attention block (TAB), that fuses information across tasks. Unlike traditional attention mechanisms, the TAB operates at the level of feature mixing and is decoupled from spatial attention. This is shown to be more effective for CIL than joint spatial-and-task attention. The proposed DNE approach strictly maintains the feature space of old classes while growing the network and feature scale at a much slower rate than previous methods. As a result, it outperforms the previous SOTA methods by a margin of 4% in accuracy, with similar or even smaller model scale.
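To make the cross-task attention idea concrete, the following is a minimal PyTorch sketch of attention that mixes features across task experts independently at each spatial position, decoupled from spatial attention, as the abstract describes. It is not the authors' implementation: the class name `TaskAttentionBlock`, the single new-expert query, and all tensor conventions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TaskAttentionBlock(nn.Module):
    """Hypothetical sketch of cross-task attention: attention runs over the
    task axis at each spatial location, so feature mixing across task experts
    is decoupled from spatial mixing (which is left to the backbone)."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, expert_feats: torch.Tensor) -> torch.Tensor:
        # expert_feats: (B, T, C, H, W), intermediate features of T task experts.
        B, T, C, H, W = expert_feats.shape
        # Treat each spatial position as an independent sequence of T task tokens.
        tokens = expert_feats.permute(0, 3, 4, 1, 2).reshape(B * H * W, T, C)
        tokens = self.norm(tokens)
        # Assumption: the newest expert queries all task tokens, so old-task
        # features are read (reused) but never modified.
        query = tokens[:, -1:, :]                  # (B*H*W, 1, C)
        fused, _ = self.attn(query, tokens, tokens)
        # Restore the spatial layout of the fused feature map: (B, C, H, W).
        return fused.reshape(B, H, W, C).permute(0, 3, 1, 2)

# Toy usage: fuse features from 3 task experts, 64 channels, on an 8x8 map.
tab = TaskAttentionBlock(dim=64)
out = tab(torch.randn(2, 3, 64, 8, 8))
print(out.shape)  # torch.Size([2, 64, 8, 8])
```

Restricting attention to the task axis keeps the cost at O(T^2) per spatial location, rather than the O((HWT)^2) of joint attention over all spatial-and-task tokens, which is consistent with the abstract's claim that the model grows slowly with the number of tasks.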
Pages: 11858-11867
Page count: 10
Related Papers (50 records in total)
  • [21] Leveraging joint incremental learning objective with data ensemble for class incremental learning
    Mazumder, Pratik
    Karim, Mohammed Asad
    Joshi, Indu
    Singh, Pravendra
    NEURAL NETWORKS, 2023, 161: 202-212
  • [22] Class-Incremental Learning with Repetition
    Hemati, Hamed
    Cossu, Andrea
    Carta, Antonio
    Hurtado, Julio
    Pellegrini, Lorenzo
    Bacciu, Davide
    Lomonaco, Vincenzo
    Borth, Damian
    CONFERENCE ON LIFELONG LEARNING AGENTS, 2023, 232: 437-455
  • [23] Federated Class-Incremental Learning
    Dong, Jiahua
    Wang, Lixu
    Fang, Zhen
    Sun, Gan
    Xu, Shichao
    Wang, Xiao
    Zhu, Qi
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022: 10154-10163
  • [24] Double distillation for class incremental learning
    Onchis, Darian M.
    Samuila, Ioan-Valentin
    2021 23RD INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND NUMERIC ALGORITHMS FOR SCIENTIFIC COMPUTING (SYNASC 2021), 2021: 182-185
  • [25] Class-Incremental Learning: A Survey
    Zhou, Da-Wei
    Wang, Qi-Wei
    Qi, Zhi-Hong
    Ye, Han-Jia
    Zhan, De-Chuan
    Liu, Ziwei
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12): 9851-9873
  • [26] Rethinking class orders and transferability in class incremental learning
    He, Chen
    Wang, Ruiping
    Chen, Xilin
    PATTERN RECOGNITION LETTERS, 2022, 161: 67-73
  • [27] Learning to Classify With Incremental New Class
    Zhou, Da-Wei
    Yang, Yang
    Zhan, De-Chuan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (06): 2429-2443
  • [28] Incremental Class Dictionary Learning and Optimization
    Bergstein, P. L.
    Lieberherr, K. J.
    LECTURE NOTES IN COMPUTER SCIENCE, 1991, 512: 377-396
  • [29] Incremental Class Learning for Hierarchical Classification
    Park, Ju-Youn
    Kim, Jong-Hwan
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (01): 178-189
  • [30] Dense Siamese Network for Dense Unsupervised Learning
    Zhang, Wenwei
    Pang, Jiangmiao
    Chen, Kai
    Loy, Chen Change
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690: 464-480