Incremental Learning with Differentiable Architecture and Forgetting Search

Cited by: 1
Authors
Smith, James Seale [1 ]
Seymour, Zachary [2 ]
Chiu, Han-Pang [2 ]
Affiliations
[1] Georgia Inst Technol, Atlanta, GA 30332 USA
[2] SRI Int, Princeton, NJ USA
Keywords
continual learning; incremental learning; neural architecture search
DOI
10.1109/IJCNN55064.2022.9892400
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
As progress is made on training machine learning models on incrementally expanding classification tasks (i.e., incremental learning), a next step is to translate this progress to industry expectations. One technique missing from incremental learning is automatic architecture design via Neural Architecture Search (NAS). In this paper, we show that leveraging NAS for incremental learning results in strong performance gains for classification tasks. Specifically, we contribute the following: first, we create a strong baseline approach for incremental learning based on Differentiable Architecture Search (DARTS) and state-of-the-art incremental learning strategies, outperforming many existing strategies trained with similar-sized popular architectures; second, we extend the idea of architecture search to regularize architecture forgetting, boosting performance past our proposed baseline. We evaluate our method on both RF signal and image classification tasks, and demonstrate that we can achieve up to a 10% performance increase over state-of-the-art methods. Most importantly, our contribution enables learning from continuous distributions on real-world application data for which the complexity of the data distribution is unknown, or the modality is less explored (such as RF signal classification).
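The abstract names two ingredients: a DARTS supernetwork, whose architecture is encoded by differentiable weights over candidate operations, and a search-time regularizer that discourages "architecture forgetting" across tasks. As a rough illustration only, not the paper's actual formulation, the PyTorch sketch below pairs a minimal DARTS mixed operation with a hypothetical quadratic penalty on the drift of the architecture weights from their previous-task values; the candidate-operation set, the penalty form, and the names MixedOp, arch_forgetting_penalty, and strength are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A small illustrative candidate-operation set for one DARTS edge.
CANDIDATES = {
    "skip":  lambda ch: nn.Identity(),
    "conv3": lambda ch: nn.Conv2d(ch, ch, kernel_size=3, padding=1, bias=False),
    "conv5": lambda ch: nn.Conv2d(ch, ch, kernel_size=5, padding=2, bias=False),
}

class MixedOp(nn.Module):
    """DARTS-style mixed operation: a softmax-weighted sum of candidates."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(make(channels) for make in CANDIDATES.values())
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(CANDIDATES)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def arch_forgetting_penalty(edges, old_alphas, strength=1.0):
    """Hypothetical regularizer (an assumption, not taken from the paper):
    quadratic penalty on the drift of the architecture parameters away
    from the values learned on the previous task."""
    drift = sum((e.alpha - a_old).pow(2).sum()
                for e, a_old in zip(edges, old_alphas))
    return strength * drift

# Usage sketch at one task boundary.
edge = MixedOp(channels=16)
old_alphas = [edge.alpha.detach().clone()]   # snapshot after task t-1
x = torch.randn(4, 16, 8, 8)
task_loss = edge(x).pow(2).mean()            # stand-in for the task-t loss
loss = task_loss + arch_forgetting_penalty([edge], old_alphas, strength=0.1)
loss.backward()
```

In DARTS proper, the architecture weights and the operation weights are updated in an alternating bilevel loop on separate data splits; the single joint loss above is only to keep the sketch short.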
Pages: 8
Related Papers (50 total)
  • [31] MergeNAS: Merge Operations into One for Differentiable Architecture Search
    Wang, Xiaoxing
    Xue, Chao
    Yan, Junchi
    Yang, Xiaokang
    Hu, Yonggang
    Sun, Kewei
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020: 3065 - 3072
  • [32] Riemannian Walk for Incremental Learning: Understanding Forgetting and Intransigence
    Chaudhry, Arslan
    Dokania, Puneet K.
    Ajanthan, Thalaiyasingam
    Torr, Philip H. S.
    COMPUTER VISION - ECCV 2018, PT XI, 2018, 11215 : 556 - 572
  • [33] An Efficient Strategy for Catastrophic Forgetting Reduction in Incremental Learning
    Doan, Huong-Giang
    Luong, Hong-Quan
    Ha, Thi-Oanh
    Pham, Thi Thanh Thuy
    ELECTRONICS, 2023, 12 (10)
  • [34] Attenuating Catastrophic Forgetting by Joint Contrastive and Incremental Learning
    Ferdinand, Quentin
    Clement, Benoit
    Oliveau, Quentin
    Le Chenadec, Gilles
    Papadakis, Panagiotis
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022: 3781 - 3788
  • [35] Balancing Between Forgetting and Acquisition in Incremental Subpopulation Learning
    Liang, Mingfu
    Zhou, Jiahuan
    Wei, Wei
    Wu, Ying
    COMPUTER VISION, ECCV 2022, PT XXVI, 2022, 13686 : 364 - 380
  • [36] NDARTS: A Differentiable Architecture Search Based on the Neumann Series
    Han, Xiaoyu
    Li, Chenyu
    Wang, Zifan
    Liu, Guohua
    ALGORITHMS, 2023, 16 (12)
  • [37] Mean-Shift Based Differentiable Architecture Search
    Hsieh, J.-W.
    Chou, C.-H.
    Chang, M.-C.
    Chen, P.-Y.
    Santra, S.
    Huang, C.-S.
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (03): 1235 - 1246
  • [38] Differentiable neural architecture search with channel performance measurement
    Pan, J.
    Zheng, X.-C.
    Zou, X.-Y.
    KONGZHI YU JUECE/CONTROL AND DECISION, 2024, 39 (07): 2151 - 2160
  • [39] D-DARTS: Distributed Differentiable Architecture Search
    Heuillet, Alexandre
    Tabia, Hedi
    Arioui, Hichem
    Youcef-Toumi, Kamal
    PATTERN RECOGNITION LETTERS, 2023, 176 : 42 - 48
  • [40] DASS: Differentiable Architecture Search for Sparse Neural Networks
    Mousavi, Hamid
    Loni, Mohammad
    Alibeigi, Mina
    Daneshtalab, Masoud
    ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)