Hybrid-context-based multi-prior entropy modeling for learned lossless image compression

Cited by: 0
Authors
Fu, Chuan [1 ]
Du, Bo [2 ]
Zhang, Liangpei [3 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[2] Wuhan Univ, Sch Comp, Wuhan, Peoples R China
[3] Henan Acad Sci, Aerosp Informat Res Inst, Zhengzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Lossless image compression; Hybrid-context; Multi-prior-based entropy model; ALGORITHM;
DOI
10.1016/j.patcog.2024.110632
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Lossless image compression is an essential aspect of image processing, particularly in fields that require high information fidelity. In recent years, learned lossless image compression methods have shown promising results. However, many of these methods do not make optimal use of the available information, leading to suboptimal performance. This paper proposes a multi-prior entropy model for lossless image compression that effectively leverages the available information to achieve better compression performance. The proposed multi-prior comprises a cross-channel prior, a hybrid local context, and a hyperprior, allowing it to exploit all available information. To remove redundancy across color channels, the original image is first losslessly transformed into the YUV color space. The network then learns priors from the original image, the previously coded channels, and the local context, which are fused to form the multi-prior used for GMM parameter estimation. Moreover, to capture the features of different images, a hybrid local context is extracted using masked convolutions with different kernel sizes. Experimental results on several datasets demonstrate that our algorithm outperforms several existing learning-based image compression methods as well as traditional methods such as JPEG2000, WebP, and FLIF.
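A minimal PyTorch sketch of the entropy-modeling idea described in the abstract, for illustration only: the kernel sizes (3x3 and 5x5), channel widths, number of mixture components K, and all module names are assumptions rather than details taken from the paper. It shows a type-A masked convolution applied at two kernel sizes to form a hybrid local context, which is fused with a cross-channel prior and a hyperprior to predict per-pixel GMM parameters.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv2d(nn.Conv2d):
    """Type-A masked convolution: each position sees only already-decoded pixels."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        mask = torch.ones_like(self.weight)
        _, _, kh, kw = self.weight.shape
        mask[:, :, kh // 2, kw // 2:] = 0   # current pixel and pixels to its right
        mask[:, :, kh // 2 + 1:, :] = 0     # all rows below
        self.register_buffer("mask", mask)

    def forward(self, x):
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

class HybridMultiPriorGMM(nn.Module):
    """Illustrative fusion of a hybrid local context (two masked-conv kernel sizes),
    a cross-channel prior, and a hyperprior, predicting GMM parameters per pixel."""
    def __init__(self, prior_ch=64, hidden=128, K=5):  # widths and K are assumptions
        super().__init__()
        self.ctx3 = MaskedConv2d(1, hidden, kernel_size=3, padding=1)
        self.ctx5 = MaskedConv2d(1, hidden, kernel_size=5, padding=2)
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * hidden + 2 * prior_ch, hidden, 1), nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 3 * K, 1),  # per-pixel mixture weights, means, scales
        )

    def forward(self, x_channel, cross_channel_prior, hyperprior):
        ctx = torch.cat([self.ctx3(x_channel), self.ctx5(x_channel)], dim=1)
        feats = torch.cat([ctx, cross_channel_prior, hyperprior], dim=1)
        w, mu, log_sigma = self.fuse(feats).chunk(3, dim=1)
        return torch.softmax(w, dim=1), mu, log_sigma.exp()

The GMM parameters returned here would drive an arithmetic coder over pixel values of the current channel; how the cross-channel prior and hyperprior are produced is left out of this sketch.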
Pages: 12