No-reference video quality assessment based on human visual perception

Citations: 0
Authors
Zhou, Zhou [1 ]
Kong, Guangqian [1 ]
Duan, Xun [1 ]
Long, Huiyun [1 ]
Affiliations
[1] Guizhou Univ, Coll Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
video quality assessment; UGC videos; human visual perception; attention;
DOI
10.1117/1.JEI.33.4.043029
CLC classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Subject classification codes
0808 ; 0809 ;
Abstract
Conducting video quality assessment (VQA) for user-generated content (UGC) videos and achieving consistency with subjective quality assessment are highly challenging tasks. We propose a no-reference video quality assessment (NR-VQA) method for UGC scenarios by considering characteristics of human visual perception. To distinguish between varying levels of human attention within different regions of a single frame, we devise a dual-branch network. This network extracts spatial features containing positional information of moving objects from frame-level images. In addition, we employ the temporal pyramid pooling module to effectively integrate temporal features of different scales, enabling the extraction of inter-frame temporal information. To mitigate the time-lag effect in the human visual system, we introduce the temporal pyramid attention module. This module evaluates the significance of individual video frames and simulates the varying attention levels exhibited by humans towards frames. We conducted experiments on the KoNViD-1k, LIVE-VQC, CVD2014, and YouTube-UGC databases. The experimental results demonstrate the superior performance of our proposed method compared to recent NR-VQA techniques in terms of both objective assessment and consistency with subjective assessment. (c) 2024 SPIE and IS&T
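The temporal pyramid pooling described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `temporal_pyramid_pool` and the pyramid levels `(1, 2, 4)` are assumptions, and a per-frame feature matrix of shape `(T, D)` stands in for the frame-level features produced by the dual-branch network.

```python
import numpy as np

def temporal_pyramid_pool(features, levels=(1, 2, 4)):
    """Pool a (T, D) sequence of per-frame features at several temporal
    scales and concatenate the results, mirroring the multi-scale
    temporal integration the abstract describes. `levels` is an
    illustrative choice, not the paper's configuration."""
    T, D = features.shape
    pooled = []
    for n in levels:
        # Split the T frames into n roughly equal temporal segments
        # and average-pool the features within each segment.
        bounds = np.linspace(0, T, n + 1).astype(int)
        for i in range(n):
            # Guard against empty segments when T < n.
            seg = features[bounds[i]:max(bounds[i] + 1, bounds[i + 1])]
            pooled.append(seg.mean(axis=0))
    # One D-dimensional vector per segment, so sum(levels) * D in total.
    return np.concatenate(pooled)
```

Coarse levels summarize the whole clip while finer levels preserve local temporal variation, which is what lets the concatenated vector carry inter-frame information at several scales.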
Pages: 15
Related Papers
50 records
  • [41] A no-reference video quality assessment method based on digital watermark
    Yang, FZ
    Wang, XD
    Chang, YL
    Wan, S
    PIMRC 2003: 14TH IEEE 2003 INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS PROCEEDINGS, VOLS 1-3 2003, 2003, : 2707 - 2710
  • [42] ATTENTION BASED NETWORK FOR NO-REFERENCE UGC VIDEO QUALITY ASSESSMENT
    Yi, Fuwang
    Chen, Mianyi
    Sun, Wei
    Min, Xiongkuo
    Tian, Yuan
    Zhai, Guangtao
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 1414 - 1418
  • [43] A Novel No-reference Objective Stereoscopic Video Quality Assessment Method Based on Visual Saliency Analysis
    Yang, Xinyan
    Zhao, Wei
    Ye, Long
    Zhang, Qin
    NINTH INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING (ICDIP 2017), 2017, 10420
  • [44] No-Reference Video Quality Assessment based on Convolutional Neural Network and Human Temporal Behavior
    Ahn, Sewoong
    Lee, Sanghoon
    2018 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2018, : 1513 - 1517
  • [45] A No-Reference Image and Video Visual Quality Metric Based on Machine Learning
    Frantc, Vladimir
    Voronin, Viacheslav
    Semenishchev, Evgenii
    Minkin, Maxim
    Delov, Aliy
    TENTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2017), 2018, 10696
  • [46] No-reference screen content video quality assessment
    Li, Teng
    Min, Xiongkuo
    Zhu, Wenhan
    Xu, Yiling
    Zhang, Wenjun
    DISPLAYS, 2021, 69
  • [47] No-reference Video Quality Assessment on Mobile Devices
    Chen, Chen
    Song, Li
    Wang, Xiangwen
    Guo, Meng
    2013 IEEE INTERNATIONAL SYMPOSIUM ON BROADBAND MULTIMEDIA SYSTEMS AND BROADCASTING (BMSB), 2013,
  • [48] No-reference image quality assessment method based on visual parameters
    Liu Y.-H.
    Yang K.-F.
    Yan H.-M.
    Journal of Electronic Science and Technology, 2019, 17 (02) : 171 - 184
  • [49] Analysis and Modelling of No-Reference Video Quality Assessment
    Tian, Yuan
    Zhu, Ming
    2009 INTERNATIONAL CONFERENCE ON COMPUTER AND AUTOMATION ENGINEERING, PROCEEDINGS, 2009, : 108 - 112