TINC: Tree-structured Implicit Neural Compression

Cited by: 4
Authors
Yang, Runzhao [1]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
DOI
10.1109/CVPR52729.2023.01776
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Implicit neural representation (INR) can describe target scenes with high fidelity using a small number of parameters, and is emerging as a promising data compression technique. However, limited spectrum coverage is intrinsic to INR, and it is non-trivial to effectively remove redundancy in diverse, complex data. Preliminary studies can exploit only either global or local correlation in the target data and thus achieve limited performance. In this paper, we propose Tree-structured Implicit Neural Compression (TINC), which builds compact representations for local regions and extracts the shared features of these local representations in a hierarchical manner. Specifically, we use Multi-Layer Perceptrons (MLPs) to fit the partitioned local regions, and these MLPs are organized in a tree structure to share parameters according to spatial distance. The parameter-sharing scheme not only ensures continuity between adjacent regions but also jointly removes local and non-local redundancy. Extensive experiments show that TINC improves the compression fidelity of INR and achieves impressive compression capability compared with commercial tools and other deep-learning-based methods. Moreover, the approach is highly flexible and can be tailored for different data and parameter settings. The source code can be found at https://github.com/RichealYoung/TINC.
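The parameter-sharing idea in the abstract — leaf MLPs organized in a tree, with spatially close regions sharing more parameters through common ancestors — can be illustrated with a minimal sketch. This is an illustrative assumption-heavy toy, not the paper's implementation: it uses a binary tree over a 1-D domain and an additive sharing rule (a leaf's weights are the sum of the parameter vectors on its root-to-leaf path), whereas TINC partitions 3-D volumes and its sharing mechanism differs in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

DEPTH = 3    # levels below the root; 2**DEPTH leaf regions tile [0, 1)
HIDDEN = 8   # hidden width of each tiny leaf MLP
# Flat parameter count for a 1-D coordinate -> scalar intensity MLP:
# W1 (1 x HIDDEN) + b1 + W2 (HIDDEN x 1) + b2
N_PARAMS = HIDDEN + HIDDEN + HIDDEN + 1

# One small parameter vector per tree node, keyed by (level, index).
node_params = {
    (level, idx): rng.normal(scale=0.1, size=N_PARAMS)
    for level in range(DEPTH + 1)
    for idx in range(2 ** level)
}

def leaf_parameters(leaf_idx):
    """Assemble a leaf MLP's weights by summing the vectors on its
    root-to-leaf path: siblings share every ancestor vector, so nearby
    regions share more parameters than distant ones."""
    theta = np.zeros(N_PARAMS)
    idx = leaf_idx
    for level in range(DEPTH, -1, -1):
        theta += node_params[(level, idx)]
        idx //= 2   # move to the parent node
    return theta

def mlp(theta, x):
    """Tiny coordinate network: scalar coordinate -> scalar intensity."""
    w1 = theta[:HIDDEN]
    b1 = theta[HIDDEN:2 * HIDDEN]
    w2 = theta[2 * HIDDEN:3 * HIDDEN]
    b2 = theta[-1]
    h = np.tanh(x * w1 + b1)
    return float(h @ w2 + b2)

def decode(x):
    """Route a coordinate in [0, 1) to the MLP of its leaf region."""
    leaf_idx = min(int(x * 2 ** DEPTH), 2 ** DEPTH - 1)
    return mlp(leaf_parameters(leaf_idx), x)
```

Under this rule, adjacent leaves 0 and 1 differ only in their own leaf-level vectors (all ancestors are shared), which both encourages continuity across the region boundary and deduplicates the storage of shared structure; leaves at opposite ends of the domain share only the root.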
Pages: 18517 - 18526
Page count: 10
Related Papers
50 records total
  • [31] Tree-Structured Shading Decomposition
    Geng, Chen
    Yu, Hong-Xing
    Zhang, Sharon
    Agrawala, Maneesh
    Wu, Jiajun
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 488 - 498
  • [32] Tree-Structured Haar Transforms
    Egiazarian, Karen
    Astola, Jaakko
    Journal of Mathematical Imaging and Vision, 2002, 16 : 269 - 279
  • [33] Diagnosing tree-structured systems
    Stumptner, M
    Wotawa, F
    ARTIFICIAL INTELLIGENCE, 2001, 127 (01) : 1 - 29
  • [34] Tree-Structured Readings of the Tractatus
    Stern, David
    WITTGENSTEIN-STUDIEN, 2023, 14 (01) : 223 - 262
  • [35] Tree-Structured Neural Machine for Linguistics-Aware Sentence Generation
    Zhou, Ganbin
    Luo, Ping
    Cao, Rongyu
    Xiao, Yijun
    Lin, Fen
    Chen, Bo
    He, Qing
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 5722 - 5729
  • [36] Tree-Structured Neural Networks With Topic Attention for Social Emotion Classification
    Wang, Chang
    Wang, Bang
    Xu, Minghua
    IEEE ACCESS, 2019, 7 : 95505 - 95515
  • [37] Digital Modulation Recognition Method Based on Tree-Structured Neural Networks
    Xu, Yiqiong
    Ge, Lindong
    Wang, Bo
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMMUNICATION SOFTWARE AND NETWORKS, 2009, : 708 - 712
  • [38] A tree-structured artificial neural network for chaotic time series prediction
    Andras, P
    ARTIFICIAL INTELLIGENCE: METHODOLOGY, SYSTEMS, APPLICATIONS, 1996, 35 : 119 - 125
  • [39] Selectively tree-structured vector quantizer using Kohonen neural network
    Wang, W
    Li, X
    Lu, DJ
    ICSP '96 - 1996 3RD INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, PROCEEDINGS, VOLS I AND II, 1996, : 1504 - 1507
  • [40] Tree-structured modelling of varying coefficients
    Berger, Moritz
    Tutz, Gerhard
    Schmid, Matthias
    STATISTICS AND COMPUTING, 2019, 29 (02) : 217 - 229