Learning to Generate Diverse Data From a Temporal Perspective for Data-Free Quantization

Cited by: 0
Authors
Luo, Hui [1 ,2 ,3 ]
Zhang, Shuhai [4 ,5 ]
Zhuang, Zhuangwei [4 ,5 ]
Mai, Jiajie [4 ]
Tan, Mingkui [4 ,5 ]
Zhang, Jianlin [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Natl Key Lab Opt Field Manipulat Sci & Technol, Chengdu 610209, Peoples R China
[2] Chinese Acad Sci, Inst Opt & Elect, Key Lab Opt Engn, Chengdu 610209, Peoples R China
[3] Univ Chinese Acad Sci, Sch Elect Elect & Commun Engn, Beijing 100049, Peoples R China
[4] South China Univ Technol, Sch Software Engn, Guangzhou 510641, Peoples R China
[5] South China Univ Technol, Minist Educ, Key Lab Big Data & Intelligent Robot, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Quantization (signal); Data models; Synthetic data; Generators; Computational modeling; Training; Analytical models; Model quantization; data-free quantization; generation process; synthetic data; linear interpolation; BINARY NEURAL-NETWORKS; ACCURATE;
DOI
10.1109/TCSVT.2024.3399311
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Model quantization is a prevalent method to compress and accelerate neural networks. Most existing quantization methods require access to real data to improve the performance of quantized models, which is often infeasible in scenarios with privacy and security concerns. Recently, data-free quantization has been widely studied to address the lack of real data by generating synthetic data, among which generator-based data-free quantization is an important type. Previous generator-based methods focus on improving the performance of quantized models by optimizing the spatial distribution of synthetic data, while ignoring how the synthetic data change from a temporal perspective. In this work, we reveal that generator-based data-free quantization methods usually suffer from an issue where the synthetic data become homogeneous in the mid-to-late stages of the generation process because the generator's updates stagnate, which hinders further improvement of the quantized models. To solve this issue, we propose introducing the discrepancy between the full-precision and quantized models as new supervision information for updating the generator. Specifically, we propose a simple yet effective adversarial Gaussian-margin loss, which promotes continuous updating of the generator by supplying additional supervision when the discrepancy between the full-precision and quantized models is small, thereby generating heterogeneous synthetic data. Moreover, to further mitigate the homogeneity of the synthetic data, we augment the synthetic data with linear interpolation. Our proposed method can also boost the performance of other generator-based data-free quantization methods. Extensive experimental results show that our proposed method achieves superior performance under various data-free quantization settings, especially ultra-low-bit settings such as 3-bit.
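The abstract describes two mechanisms: a Gaussian-margin term that strengthens the generator's training signal when the full-precision/quantized discrepancy shrinks, and mixup-style linear interpolation of synthetic samples. Below is a minimal NumPy sketch of both ideas under stated assumptions; the function names, the exact Gaussian weighting form, and the hyperparameters (`margin`, `sigma`, `alpha`) are illustrative guesses, not the paper's actual implementation.

```python
import numpy as np

def gaussian_margin_weight(discrepancy, margin=1.0, sigma=0.5):
    """Hypothetical Gaussian-margin weighting: when the discrepancy
    between the full-precision and quantized models drops below the
    margin, the weight grows, so the generator keeps receiving a
    strong update signal instead of stagnating."""
    gap = max(margin - discrepancy, 0.0)
    return 1.0 + np.exp(-(discrepancy ** 2) / (2.0 * sigma ** 2)) * gap

def interpolate_synthetic(x_a, x_b, alpha=0.4, rng=None):
    """Mixup-style linear interpolation between two synthetic batches,
    intended to diversify (de-homogenize) the generated data."""
    if rng is None:
        rng = np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)  # mixing coefficient in (0, 1)
    return lam * x_a + (1.0 - lam) * x_b
```

In this sketch, a small discrepancy (e.g. 0.0) yields a weight of 2.0 while a discrepancy beyond the margin falls back to 1.0, matching the stated intent of adding supervision only when the two models already agree.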
Pages: 9484-9498
Page count: 15