SP-GAN: Self-Growing and Pruning Generative Adversarial Networks

Cited by: 17
Authors
Song, Xiaoning [1 ]
Chen, Yao [1 ]
Feng, Zhen-Hua [2 ,3 ]
Hu, Guosheng [4 ]
Yu, Dong-Jun [5 ]
Wu, Xiao-Jun [1 ]
Affiliations
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi 214122, Jiangsu, Peoples R China
[2] Univ Surrey, Dept Comp Sci, Guildford GU2 7XH, Surrey, England
[3] Univ Surrey, Ctr Vis Speech & Signal Proc CVSSP, Guildford GU2 7XH, Surrey, England
[4] Anyvision, Belfast BT3 9DT, Antrim, North Ireland
[5] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
Funding
National Natural Science Foundation of China; Engineering and Physical Sciences Research Council (EPSRC), UK; China Postdoctoral Science Foundation;
Keywords
Training; Generative adversarial networks; Generators; Adaptation models; Convolution; Stability analysis; Adaptive loss function; generative adversarial networks (GANs); pruning; self-growing;
DOI
10.1109/TNNLS.2020.3005574
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
This article presents a new Self-Growing and Pruning Generative Adversarial Network (SP-GAN) for realistic image generation. In contrast to traditional GAN models, SP-GAN can dynamically adjust the size and architecture of the network during training by means of the proposed self-growing and pruning mechanisms. More specifically, we first train two seed networks as the generator and discriminator, each containing a small number of convolution kernels. Such small-scale networks are much easier and faster to train than large-capacity ones. Second, in the self-growing step, we replicate the convolution kernels of each seed network to enlarge the network, then fine-tune the expanded network. More importantly, to prevent excessive growth of each seed network during the self-growing stage, we propose a pruning strategy that reduces the redundancy of an expanded network, yielding its optimal scale. Finally, we design a new adaptive loss function whose computation varies across training stages; by design, the hyperparameters of the loss function adapt dynamically to different stages of training. Experimental results obtained on several datasets demonstrate the merits of the proposed method, especially in terms of the stability and efficiency of network training. The source code of the proposed SP-GAN method is publicly available at https://github.com/Lambert-chen/SPGAN.git.
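The grow-then-prune cycle described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the replication-with-noise growing step and L1-norm magnitude pruning are assumptions chosen to mirror the described mechanism, and the function names `grow` and `prune` are hypothetical.

```python
import numpy as np

def grow(weights, factor=2):
    """Self-growing step (sketch): replicate a layer's convolution
    kernels to widen it, as SP-GAN replicates seed-network kernels.
    `weights` has shape (out_channels, in_channels, k, k)."""
    # Duplicate the kernel bank `factor` times with small noise so the
    # copies can diverge during the subsequent fine-tuning.
    copies = [weights + 0.01 * np.random.randn(*weights.shape)
              for _ in range(factor)]
    return np.concatenate(copies, axis=0)

def prune(weights, keep_ratio=0.5):
    """Pruning step (sketch): keep only the kernels with the largest
    L1 norms, reducing the redundancy introduced by growing."""
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(weights.shape[0] * keep_ratio))
    keep = np.argsort(norms)[-n_keep:]          # indices of strongest kernels
    return weights[np.sort(keep)]                # preserve original ordering

seed = np.random.randn(8, 3, 3, 3)      # small seed layer: 8 kernels
grown = grow(seed)                      # widened to 16 kernels
pruned = prune(grown, keep_ratio=0.75)  # pruned back to 12 kernels
print(grown.shape[0], pruned.shape[0])  # 16 12
```

In the paper this cycle is interleaved with fine-tuning of the expanded generator and discriminator; the sketch shows only the width adjustment of a single layer.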
Pages: 2458-2469
Page count: 12
Related papers
50 records in total
  • [1] SP-GAN: Cycle-Consistent Generative Adversarial Networks for Shadow Puppet Generation
    Tong, Yanxin
    Xu, Jiale
    Du, Xuan
    Huang, Jingzhou
    Zhou, Houpan
    2024 IEEE INTERNATIONAL CONFERENCE ON CYBERNETICS AND INTELLIGENT SYSTEMS, CIS AND IEEE INTERNATIONAL CONFERENCE ON ROBOTICS, AUTOMATION AND MECHATRONICS, RAM, CIS-RAM 2024, 2024, : 32 - 38
  • [2] Semi-supervised self-growing generative adversarial networks for image recognition
    Xu, Zhiwei
    Wang, Haoqian
    Yang, Yi
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (11) : 17461 - 17486
  • [4] αβ-GAN: Robust generative adversarial networks
    Gnanha, Aurele Tohokantche
    Cao, Wenming
    Mao, Xudong
    Wu, Si
    Wong, Hau-San
    Li, Qing
    INFORMATION SCIENCES, 2022, 593 : 177 - 200
  • [5] RG-GAN: Dynamic Regenerative Pruning for Data-Efficient Generative Adversarial Networks
    Saxena, Divya
    Cao, Jiannong
    Xu, Jiahao
    Kulshrestha, Tarun
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 5, 2024, : 4704 - 4712
  • [6] Self-Diagnosing GAN: Diagnosing Underrepresented Samples in Generative Adversarial Networks
    Lee, Jinhee
    Kim, Haeri
    Hong, Youngkyu
    Chung, Hye Won
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] Incremental self-growing neural networks with the changing environment
    Su, L.
    Guan, S.U.
    Yeo, Y.C.
    Journal of Intelligent Systems, 2001, 11 (01) : 43 - 74
  • [8] A Paradigm for the Development of Self-Growing Energy-Aware Networks
    Chochliouros, Ioannis P.
    Spiliopoulou, Anastasia S.
    Sfakianakis, Evangelos
    Mitsopoulou, Nina
    Alonistioti, Nancy
    Stamatelatos, Makis
    Chatzimisios, Periklis
    2012 8TH INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING CONFERENCE (IWCMC), 2012, : 654 - 659
  • [9] GAN-GLS: Generative Lyric Steganography Based on Generative Adversarial Networks
    Wang, Cuilin
    Liu, Yuling
    Tong, Yongju
    Wang, Jingwen
    CMC-COMPUTERS MATERIALS & CONTINUA, 2021, 69 (01): : 1375 - 1390
  • [10] Self-Attention Generative Adversarial Networks
    Zhang, Han
    Goodfellow, Ian
    Metaxas, Dimitris
    Odena, Augustus
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97