Information-Theoretic GAN Compression with Variational Energy-based Model

Cited by: 0
Authors
Kang, Minsoo [1 ]
Yoo, Hyewon [2 ]
Kang, Eunhee [3 ]
Ki, Sehwan [3 ]
Lee, Hyong-Euk [3 ]
Han, Bohyung [1 ,2 ]
Affiliations
[1] Seoul Natl Univ, ECE, Seoul, South Korea
[2] Seoul Natl Univ, IPAI, Seoul, South Korea
[3] Samsung Adv Inst Technol SAIT, Suwon, South Korea
Funding
National Research Foundation of Singapore;
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose an information-theoretic knowledge distillation approach for the compression of generative adversarial networks, which aims to maximize the mutual information between teacher and student networks via a variational optimization based on an energy-based model. Because the direct computation of the mutual information in continuous domains is intractable, our approach instead optimizes the student network by maximizing a variational lower bound of the mutual information. To achieve a tight lower bound, we introduce an energy-based model relying on a deep neural network to represent a flexible variational distribution that deals with high-dimensional images and considers spatial dependencies between pixels effectively. Since the proposed method is a generic optimization algorithm, it can be conveniently incorporated into arbitrary generative adversarial networks and even dense prediction networks, e.g., image enhancement models. We demonstrate that the proposed algorithm consistently achieves outstanding performance in model compression of generative adversarial networks when combined with several existing models.
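The abstract's key idea, maximizing a variational lower bound on the mutual information between teacher and student because the exact quantity is intractable in continuous domains, can be illustrated with the classical Donsker-Varadhan bound that such estimators build on. The sketch below is a minimal NumPy illustration under stated assumptions: the hand-crafted critic and the toy Gaussian data are hypothetical stand-ins, not the paper's learned energy-based model or image features.

```python
import numpy as np

def dv_lower_bound(critic, x, y, rng):
    """Donsker-Varadhan lower bound on mutual information:
    I(X;Y) >= E_{p(x,y)}[T(x,y)] - log E_{p(x)p(y)}[exp(T(x,y'))].
    """
    joint = critic(x, y)                     # critic T on paired (joint) samples
    y_shuffled = y[rng.permutation(len(y))]  # break pairing -> product of marginals
    marginal = critic(x, y_shuffled)
    return joint.mean() - np.log(np.exp(marginal).mean())

# Toy setup: y depends on x, so the true mutual information is positive.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)

# Hypothetical hand-crafted critic (the paper instead parametrizes it with a
# deep energy-based model); scaled down to keep exp() numerically stable.
critic = lambda a, b: 0.2 * a * b

bound = dv_lower_bound(critic, x, y, rng)
print(f"DV lower bound on I(X;Y): {bound:.3f}")  # positive for dependent x, y
```

In the paper's setting, the critic would be the trainable energy-based model, and the student generator is optimized to push this bound up, i.e., to stay maximally informative about the teacher's outputs.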
Pages: 15
Related Papers
50 results in total
  • [1] Information-Theoretic Limits on Compression of Semantic Information
    Tang, Jiancheng
    Yang, Qianqian
    Zhang, Zhaoyang
    CHINA COMMUNICATIONS, 2024, 21 (07) : 1 - 16
  • [3] Population Risk Improvement with Model Compression: An Information-Theoretic Approach
    Bu, Yuheng
    Gao, Weihao
    Zou, Shaofeng
    Veeravalli, Venugopal V.
    ENTROPY, 2021, 23 (10)
  • [4] Information-Theoretic Understanding of Population Risk Improvement with Model Compression
    Bu, Yuheng
    Gao, Weihao
    Zou, Shaofeng
    Veeravalli, Venugopal V.
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 3300 - 3307
  • [5] Variational information-theoretic atoms-in-molecules
    Heidar-Zadeh, Farnaz
    Verstraelen, Toon
    Vohringer-Martinez, Esteban
    Bultinck, Patrick
    Ayers, Paul
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2018, 255
  • [6] An information-theoretic model for steganography
    Cachin, C
    INFORMATION AND COMPUTATION, 2004, 192 (01) : 41 - 56
  • [7] An information-theoretic model for steganography
    Cachin, C
    INFORMATION HIDING, 1998, 1525 : 306 - 318
  • [8] ON AN INFORMATION-THEORETIC MODEL OF EXPLANATION
    WOODWARD, J
    PHILOSOPHY OF SCIENCE, 1987, 54 (01) : 21 - 44
  • [9] An Information-Theoretic Perspective on Proper Quaternion Variational Autoencoders
    Grassucci, Eleonora
    Comminiello, Danilo
    Uncini, Aurelio
    ENTROPY, 2021, 23 (07)
  • [10] Information-theoretic compression of pose graphs for laser-based SLAM
    Kretzschmar, Henrik
    Stachniss, Cyrill
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2012, 31 (11) : 1219 - 1230