No Data Augmentation? Alternative Regularizations for Effective Training on Small Datasets

Cited by: 1
Authors
Brigato, Lorenzo [1 ]
Mougiakakou, Stavroula [1 ]
Affiliations
[1] Univ Bern, ARTORG Ctr Biomed Engn Res, AI Hlth & Nutr, Bern, Switzerland
DOI
10.1109/ICCVW60793.2023.00021
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Solving image classification tasks given small training datasets remains an open challenge for modern computer vision. Aggressive data augmentation and generative models are among the most straightforward approaches to overcoming the lack of data. However, the former is not agnostic to varying image domains, while the latter requires additional compute and careful design. In this work, we study alternative regularization strategies to push the limits of supervised learning on small image classification datasets. In particular, along with scaling the model size and training schedule, we employ a heuristic that selects (semi-)optimal learning-rate and weight-decay pairs via the norm of the model parameters. By training on only 1% of the original CIFAR-10 training set (i.e., 50 images per class) and testing on ciFAIR-10, a variant of the original CIFAR without duplicated images, we reach a test accuracy of 66.5%, on par with the best state-of-the-art methods.
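The norm-based hyperparameter heuristic mentioned in the abstract could, in spirit, look like the following minimal sketch: run a short training probe for each candidate (learning rate, weight decay) pair and keep the pair whose resulting parameter norm lands closest to a target value. All names here (`select_lr_wd`, `short_train`, the exponential-shrinkage toy model, and the target norm) are hypothetical illustrations, not the paper's actual procedure or criterion.

```python
import numpy as np

def param_norm(params):
    """L2 norm of a flat parameter vector."""
    return float(np.linalg.norm(params))

def select_lr_wd(candidates, short_train, target_norm):
    """Pick the (lr, wd) pair whose post-probe parameter norm is
    closest to target_norm (a hypothetical selection rule)."""
    best, best_gap = None, float("inf")
    for lr, wd in candidates:
        gap = abs(param_norm(short_train(lr, wd)) - target_norm)
        if gap < best_gap:
            best, best_gap = (lr, wd), gap
    return best

# Toy stand-in for a short training run: stronger effective weight
# decay shrinks the parameter vector more (illustration only).
rng = np.random.default_rng(0)
w0 = rng.normal(size=100)

def short_train(lr, wd, steps=100):
    return w0 * np.exp(-lr * wd * steps)

grid = [(lr, wd) for lr in (0.01, 0.1, 1.0) for wd in (1e-4, 1e-3, 1e-2)]
print(select_lr_wd(grid, short_train, target_norm=5.0))
```

In this toy setup the pair with the largest lr·wd product shrinks the weights the most, so the selection simply trades off how far each candidate pulls the norm toward the target; a real probe would replace `short_train` with a few epochs of actual SGD.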
Pages: 139-148 (10 pages)