Learnable GAN Regularization for Improving Training Stability in Limited Data Paradigm

Times Cited: 1
Authors
Singh, Nakul [1 ]
Sandhan, Tushar [1 ]
Affiliations
[1] Indian Institute of Technology Kanpur, Department of Electrical Engineering, Perception & Intelligence Lab, Kanpur, Uttar Pradesh, India
Source
COMPUTER VISION AND IMAGE PROCESSING, CVIP 2023, PT II | 2024 / Vol. 2010
Keywords
GAN; Generator; Discriminator; Regularization; Overfitting; Limited data;
DOI
10.1007/978-3-031-58174-8_45
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Generative adversarial networks (GANs) are generative models that require large amounts of training data to ensure a stable learning trajectory during training. In the absence of sufficient data, GANs suffer from unstable training dynamics that adversely affect the quality of the generated data. This behavior is attributed to the adversarial learning process and the classifier-like functioning of the discriminator. In data-deficient settings, the adversarial learning procedure leads the discriminator to memorize the data instead of generalizing. Given their wide applicability across generative tasks, improving GAN performance in the limited-data paradigm will further advance their use in data-scarce fields. Therefore, to circumvent this issue, we propose a loss-regularized GAN that improves performance by enforcing strong regularization on the discriminator. We conduct several experiments using limited data from the CIFAR-10 and CIFAR-100 datasets to investigate the effectiveness of the proposed model in overcoming discriminator overfitting in the absence of abundant data. We observe consistent performance improvements across all experiments compared to state-of-the-art models.
Pages: 542-554
Number of Pages: 13
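The record does not specify the exact form of the proposed loss regularization on the discriminator. As a minimal illustrative sketch, the snippet below assumes a zero-centered gradient penalty (R1) on real samples as the discriminator regularizer, a common choice for stabilizing GAN training under limited data; it is not necessarily the regularizer used in the paper. The function `discriminator_step`, the penalty weight `gamma`, and the network handles `D`/`G` are hypothetical names introduced only for this sketch.

```python
# Hypothetical sketch: a regularized discriminator update for limited-data GAN training.
# The R1 gradient penalty stands in for the paper's (unspecified) loss regularizer.
import torch
import torch.nn.functional as F

def discriminator_step(D, G, real_images, z, gamma=10.0):
    """One discriminator update with a strong penalty on real-sample gradients."""
    real_images = real_images.requires_grad_(True)   # needed for the gradient penalty
    real_logits = D(real_images)
    fake_logits = D(G(z).detach())                   # do not backprop into the generator

    # Non-saturating adversarial loss for the discriminator.
    adv_loss = (F.softplus(-real_logits) + F.softplus(fake_logits)).mean()

    # R1 penalty: discourages the discriminator from memorizing the few real samples
    # by keeping its gradients w.r.t. the real inputs small.
    grads = torch.autograd.grad(real_logits.sum(), real_images, create_graph=True)[0]
    r1_penalty = grads.pow(2).flatten(1).sum(1).mean()

    return adv_loss + 0.5 * gamma * r1_penalty
```

In limited-data regimes the penalty weight `gamma` is typically set larger than in the full-data setting, which is one concrete way to realize the "strong regularization on the discriminator" that the abstract describes.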