A Combined Full-Reference Image Quality Assessment Method Based on Convolutional Activation Maps

Cited by: 4
Author
Varga, Domonkos [1]
Affiliation
[1] Budapest Univ Technol & Econ, Dept Networked Syst & Serv, H-1111 Budapest, Hungary
Keywords
full-reference image quality assessment; deep learning; convolutional neural networks; SIMILARITY INDEX; DEVIATION; EFFICIENT;
DOI
10.3390/a13120313
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The goal of full-reference image quality assessment (FR-IQA) is to predict the perceptual quality of an image as perceived by human observers, using its pristine (distortion-free) reference counterpart. In this study, we explore a novel, combined approach that predicts the perceptual quality of a distorted image by compiling a feature vector from convolutional activation maps. More specifically, a reference-distorted image pair is run through a pretrained convolutional neural network, and the activation maps are compared with a traditional image similarity metric. Subsequently, the resulting feature vector is mapped onto perceptual quality scores with the help of a trained support vector regressor. A detailed parameter study is also presented in which the design choices of the proposed method are explained. Furthermore, we study the relationship between the number of training images and the prediction performance; specifically, it is demonstrated that the proposed method can be trained with a small amount of data to reach high prediction performance. Our best proposal, called ActMapFeat, is compared to the state-of-the-art on six publicly available benchmark IQA databases: KADID-10k, TID2013, TID2008, MDID, CSIQ, and VCL-FER. Our method significantly outperforms the state-of-the-art on these benchmark databases.
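As a rough illustration of the pipeline described in the abstract, the sketch below shows how activation maps of a reference-distorted pair can be compared with a traditional similarity metric, pooled into a feature vector, and regressed onto quality scores. The backbone (torchvision VGG16), the choice of structural similarity as the comparison metric, and the simple statistical pooling are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of an activation-map-based FR-IQA feature pipeline.
# Assumptions (not taken from the paper): VGG16 backbone, SSIM as the
# traditional similarity metric, mean/std/min/max pooling per block.
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from skimage.metrics import structural_similarity as ssim
from sklearn.svm import SVR

# Pretrained CNN used as a fixed feature extractor (no fine-tuning).
backbone = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()
preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def activation_maps(img: Image.Image):
    """Collect activation maps after each convolutional block (at each max-pool)."""
    x = preprocess(img).unsqueeze(0)
    maps = []
    with torch.no_grad():
        for layer in backbone:
            x = layer(x)
            if isinstance(layer, torch.nn.MaxPool2d):
                maps.append(x.squeeze(0).numpy())
    return maps

def feature_vector(ref: Image.Image, dist: Image.Image) -> np.ndarray:
    """Compare reference and distorted activation maps channel by channel
    with SSIM and pool the per-channel scores into a compact feature vector."""
    feats = []
    for m_ref, m_dist in zip(activation_maps(ref), activation_maps(dist)):
        scores = [ssim(r, d, data_range=max(r.max() - r.min(), 1e-6))
                  for r, d in zip(m_ref, m_dist)]
        feats.extend([np.mean(scores), np.std(scores),
                      np.min(scores), np.max(scores)])
    return np.asarray(feats)

# The feature vectors are then mapped onto subjective quality scores with a
# support vector regressor, e.g. (X: training feature vectors, y: MOS values):
#   svr = SVR(kernel="rbf").fit(X, y)
#   quality = svr.predict(feature_vector(ref_img, dist_img).reshape(1, -1))
```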
Pages: 21
Related Papers
50 records in total
  • [31] A feature-level full-reference image denoising quality assessment method based on joint sparse representation
    Hu, Yanxiang
    Zhang, Bo
    Zhang, Ya
    Jiang, Chuan
    Chen, Zhijie
    APPLIED INTELLIGENCE, 2022, 52 : 11115 - 11130
  • [32] Dynamic Receptive Field Generation for Full-Reference Image Quality Assessment
    Kim, Woojae
    Nguyen, Anh-Duc
    Lee, Sanghoon
    Bovik, Alan Conrad
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2020, 29 : 4219 - 4231
  • [33] Machine learning to design full-reference image quality assessment algorithm
    Charrier, Christophe
    Lezoray, Olivier
    Lebrun, Gilles
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2012, 27 (03) : 209 - 219
  • [34] Full-reference image quality assessment using statistical local correlation
    Ding, Yong
    Wang, Shaoze
    Zhang, Dong
    ELECTRONICS LETTERS, 2014, 50 (02) : 79 - 80
  • [35] Machine learning to design full-reference image quality assessment algorithm
    Ling, Wang Yu
    Hu, Yang
TELKOMNIKA - INDONESIAN JOURNAL OF ELECTRICAL ENGINEERING, 2013, 11 (06) : 3439 - 3444
  • [36] A Full-Reference Image Quality Assessment Method via Deep Meta-Learning and Conformer
    Lang, Shujun
    Liu, Xu
    Zhou, Mingliang
    Luo, Jun
    Pu, Huayan
    Zhuang, Xu
    Wang, Jason
    Wei, Xuekai
    Zhang, Taiping
    Feng, Yong
    Shang, Zhaowei
    IEEE TRANSACTIONS ON BROADCASTING, 2024, 70 (01) : 316 - 324
  • [37] ASSP: An adaptive sample statistics-based pooling for full-reference image quality assessment
    Ling, Yurong
    Zhou, Fei
    Guo, Kun
    Xue, Jing-Hao
NEUROCOMPUTING, 2022, 493 : 568 - 582
  • [38] A Full-Reference Image Quality Assessment Model Based on Quadratic Gradient Magnitude and LOG Signal
    Chen, Congmin
    Mou, Xuanqin
    IMAGE AND GRAPHICS, ICIG 2019, PT I, 2019, 11901 : 702 - 713
  • [39] A Novel Full-Reference Color Image Quality Assessment Based on Energy Computation in the Wavelet Domain
    Hanumantharaju, M.
    Ravishankar, M.
    Rameshbabu, D.
    Aradhya, V.
    JOURNAL OF INTELLIGENT SYSTEMS, 2013, 22 (02) : 155 - 177