Assessing Interactive Gaming Quality of Experience Using a Crowdsourcing Approach

Cited by: 2
Authors
Schmidt, Steven [1 ]
Naderi, Babak [1 ]
Sabet, Saeed Shafiee [1 ,3 ]
Zadtootaghaj, Saman [1 ]
Moeller, Sebastian [1 ,2 ]
Affiliations
[1] Tech Univ Berlin, Qual & Usabil Lab, Berlin, Germany
[2] DFKI Projektburo Berlin, Berlin, Germany
[3] SimulaMet Oslo, Oslo, Norway
Source
2020 TWELFTH INTERNATIONAL CONFERENCE ON QUALITY OF MULTIMEDIA EXPERIENCE (QOMEX), 2020
Keywords
crowdsourcing; gaming; QoE; evaluation
DOI
10.1109/qomex48832.2020.9123122
Chinese Library Classification Number
TP39 [Computer Applications]
Subject Classification Code
081203; 0835
Abstract
Traditionally, the Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants give their opinion about the perceived quality of a stimulus on a standardized rating scale. Recently, the use of crowdsourcing micro-task platforms for assessing media quality has been increasing. Crowdsourcing platforms provide access to a geographically distributed and demographically diverse pool of workers who participate in the experiment in their own environment and using their own hardware. The main challenge in crowdsourcing QoE tests is to control the effect of interfering influencing factors, such as a user's environment and device, on the subjective ratings. While the crowdsourcing approach has frequently been used in the past for speech and video quality assessment, research on quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors, including delay, packet loss, and framerate, as well as different game designs. These factors are artificially manipulated through controlled changes in the implementation of the games. We discuss the results of a total of five studies conducted with an evaluation method that combines the ITU-T Rec. P.809 on subjective evaluation methods for gaming quality and the ITU-T Rec. P.808 on subjective evaluation of speech quality with a crowdsourcing approach. To evaluate the reliability and validity of results collected using this method, we finally compare subjective ratings regarding the effect of network delay on gaming QoE gathered from interactive crowdsourcing tests with those from equivalent laboratory experiments.
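For illustration, the comparison between crowdsourcing and laboratory results described in the abstract can be sketched as a per-condition mean opinion score (MOS) comparison. The delay conditions, rating values, and the choice of Pearson/Spearman correlation below are assumptions for demonstration only and are not taken from the paper.

```python
# Illustrative sketch (not the paper's analysis code): compare per-condition MOS
# from a crowdsourcing test with MOS from an equivalent laboratory test.
from statistics import mean
from scipy.stats import pearsonr, spearmanr

# Hypothetical 5-point ACR ratings per network-delay condition (in ms).
crowd_ratings = {0: [5, 4, 5, 4], 50: [4, 4, 3, 4], 100: [3, 3, 2, 3], 200: [2, 1, 2, 2]}
lab_ratings   = {0: [5, 5, 4, 5], 50: [4, 3, 4, 4], 100: [3, 2, 3, 3], 200: [1, 2, 2, 1]}

conditions = sorted(crowd_ratings)
crowd_mos = [mean(crowd_ratings[c]) for c in conditions]  # MOS per condition (crowd)
lab_mos = [mean(lab_ratings[c]) for c in conditions]      # MOS per condition (lab)

r, _ = pearsonr(crowd_mos, lab_mos)      # linear agreement between test paradigms
rho, _ = spearmanr(crowd_mos, lab_mos)   # rank-order (monotonic) agreement
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")
```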
Pages: 6
Related Papers
50 records
  • [1] Interactive Assessment of Gaming QoE Using Crowdsourcing
    Schmidt, Steven
    T-Labs Series in Telecommunication Services, 2023, : 97 - 128
  • [2] The 'Interactive' of Interactive Storytelling: Customizing the Gaming Experience
    Bostan, Barbaros
    Marsh, Tim
    ENTERTAINMENT COMPUTING - ICEC 2010, 2010, 6243 : 472 - +
  • [3] SwifTree: Interactive Extraction of 3D Trees Supporting Gaming and Crowdsourcing
    Huang, Mian
    Hamarneh, Ghassan
    INTRAVASCULAR IMAGING AND COMPUTER ASSISTED STENTING, AND LARGE-SCALE ANNOTATION OF BIOMEDICAL DATA AND EXPERT LABEL SYNTHESIS, 2017, 10552 : 116 - 125
  • [4] Improving Quality of Experience in Cloud Gaming Using Speculative Execution
    Ishioka, Takumasa
    Fukui, Tatsuya
    Tsugami, Ryouhei
    Fujiwara, Toshihito
    Narikawa, Satoshi
    Fujihashi, Takuya
    Saruwatari, Shunsuke
    Watanabe, Takashi
    2023 FOURTEENTH INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND UBIQUITOUS NETWORK, ICMU, 2023,
  • [5] Crowdsourcing Quality-of-Experience Assessments
    Hossfeld, Tobias
    Keimel, Christian
    Timmerer, Christian
    COMPUTER, 2014, 47 (09) : 98 - 102
  • [6] Assessing the Quality of Wikipedia Editors through Crowdsourcing
    Suzuki, Yu
    Nakamura, Satoshi
    PROCEEDINGS OF THE 25TH INTERNATIONAL CONFERENCE ON WORLD WIDE WEB (WWW'16 COMPANION), 2016, : 1001 - 1006
  • [7] Crowdsourcing interactions: using crowdsourcing for evaluating interactive information retrieval systems
    Zuccon, Guido
    Leelanupab, Teerapong
    Whiting, Stewart
    Yilmaz, Emine
    Jose, Joemon M.
    Azzopardi, Leif
    INFORMATION RETRIEVAL, 2013, 16 (02): 267 - 305
  • [8] Assessing Crowdsourcing Quality through Objective Tasks
    Aker, Ahmet
    El-Haj, Mahmoud
    Albakour, M-Dyaa
    Kruschwitz, Udo
    LREC 2012 - EIGHTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2012, : 1456 - 1461
  • [9] Interactive Refinement of Linked Data: Toward a Crowdsourcing Approach
    Roengsamut, Boonsita
    Kuwabara, Kazuhiro
    INTELLIGENT INFORMATION AND DATABASE SYSTEMS, PT I, 2015, 9011 : 3 - 12