TINC: Tree-structured Implicit Neural Compression

Cited by: 4
Authors
Yang, Runzhao [1]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
DOI
10.1109/CVPR52729.2023.01776
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Implicit neural representation (INR) can describe target scenes with high fidelity using a small number of parameters and is emerging as a promising data compression technique. However, limited spectrum coverage is intrinsic to INR, and it is non-trivial to remove redundancy in diverse, complex data effectively. Preliminary studies can exploit only either global or local correlation in the target data and thus achieve limited performance. In this paper, we propose Tree-structured Implicit Neural Compression (TINC), which builds compact representations of local regions and extracts the shared features of these local representations in a hierarchical manner. Specifically, we use Multi-Layer Perceptrons (MLPs) to fit the partitioned local regions, and these MLPs are organized in a tree structure to share parameters according to spatial distance. The parameter-sharing scheme not only ensures continuity between adjacent regions but also jointly removes local and non-local redundancy. Extensive experiments show that TINC improves the compression fidelity of INR and achieves impressive compression performance compared with commercial tools and other deep-learning-based methods. Moreover, the approach is highly flexible and can be tailored to different data and parameter settings. The source code can be found at https://github.com/RichealYoung/TINC.
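The abstract describes fitting partitioned local regions with MLPs that are organized in a tree so that parameters are shared according to spatial distance. The sketch below illustrates that idea only in broad strokes: it is a hypothetical PyTorch toy, not the released TINC code, and its binary k-d-style partition, layer widths, and tree depth are assumptions chosen for brevity.

```python
# Minimal sketch (not the authors' implementation) of tree-structured parameter
# sharing: each tree node holds a small MLP block, and a coordinate is decoded
# by composing the blocks along the root-to-leaf path, so spatially close
# regions reuse their ancestors' parameters.
import torch
import torch.nn as nn


class TreeNode(nn.Module):
    """One node of the hierarchy: a small MLP block plus optional children."""

    def __init__(self, depth: int, max_depth: int, width: int = 32):
        super().__init__()
        self.block = nn.Sequential(nn.Linear(width, width), nn.ReLU())
        self.child_nodes = nn.ModuleList(
            [] if depth == max_depth
            else [TreeNode(depth + 1, max_depth, width) for _ in range(2)]
        )

    def forward(self, h: torch.Tensor, path: list) -> torch.Tensor:
        h = self.block(h)                    # features shared by all descendants
        if not path:                         # reached the leaf owning this region
            return h
        return self.child_nodes[path[0]](h, path[1:])


class TINCSketch(nn.Module):
    """Lift a 3D coordinate, route it down the tree, project to intensity."""

    def __init__(self, max_depth: int = 3, width: int = 32):
        super().__init__()
        self.max_depth = max_depth
        self.encode = nn.Linear(3, width)    # lift (x, y, z) into feature space
        self.root = TreeNode(0, max_depth, width)
        self.head = nn.Linear(width, 1)      # per-voxel intensity

    def route(self, x: torch.Tensor) -> list:
        # Binary split of the unit cube, cycling through axes like a k-d tree:
        # nearby coordinates share a path prefix, hence ancestor MLP blocks.
        path, lo, hi = [], [0.0] * 3, [1.0] * 3
        for d in range(self.max_depth):
            a = d % 3
            mid = (lo[a] + hi[a]) / 2
            bit = int(x[a].item() >= mid)
            path.append(bit)
            if bit:
                lo[a] = mid
            else:
                hi[a] = mid
        return path

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.root(self.encode(x), self.route(x)))


if __name__ == "__main__":
    model = TINCSketch()
    y = model(torch.rand(3))                 # one normalized (x, y, z) query
    print(y.shape)                           # torch.Size([1])
```

Because every leaf region decodes through its ancestors' blocks, nearby regions share most of their parameters (addressing non-local redundancy) while each leaf block captures local detail; the actual partitioning and architecture choices are in the linked repository.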
Pages: 18517 - 18526
Page count: 10
Related Papers
50 records in total
  • [1] Tree-structured neural decoding
    d'Avignon, C
    Geman, D
    JOURNAL OF MACHINE LEARNING RESEARCH, 2004, 4 (04) : 743 - 754
  • [2] Image compression by tree-structured segmentation
    IRANIAN JOURNAL OF SCIENCE AND TECHNOLOGY, 1998, 22 (03): 381 - 388
  • [3] Image compression by tree-structured segmentation
    Asli, AZ
    Rajaei, A
    IRANIAN JOURNAL OF SCIENCE AND TECHNOLOGY, 1998, 22 (03): 381 - 388
  • [4] Tree-Structured Binary Neural Networks
    Serbetci, Ayse
    Akgul, Yusuf Sinan
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [5] Tree-Structured Neural Topic Model
    Isonuma, Masaru
    Mori, Junichiro
    Bollegala, Danushka
    Sakata, Ichiro
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 800 - 806
  • [6] Image compression with learnt tree-structured dictionaries
    Monaci, G
    Jost, P
    Vandergheynst, P
    2004 IEEE 6TH WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, 2004, : 35 - 38
  • [7] Tree-structured multilayer neural network for classification
    Shiueng-Bien Yang
    Neural Computing and Applications, 2020, 32 : 5859 - 5873
  • [8] Tree-Structured Neural Network for Hyperspectral Pansharpening
    He, Lin
    Ye, Hanghui
    Xi, Dahan
    Li, Jun
    Plaza, Antonio
    Zhang, Mei
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 2516 - 2530
  • [9] Tree-structured multilayer neural network for classification
    Yang, Shiueng-Bien
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (10): 5859 - 5873
  • [10] Piecewise linear tree-structured models for lossless image compression
    Slyz, MJ
    Neuhoff, DL
    DCC '96 - DATA COMPRESSION CONFERENCE, PROCEEDINGS, 1996, : 260 - 269