Assessing Interactive Gaming Quality of Experience Using a Crowdsourcing Approach

Cited by: 2
Authors
Schmidt, Steven [1 ]
Naderi, Babak [1 ]
Sabet, Saeed Shafiee [1 ,3 ]
Zadtootaghaj, Saman [1 ]
Moeller, Sebastian [1 ,2 ]
Affiliations
[1] Tech Univ Berlin, Qual & Usabil Lab, Berlin, Germany
[2] DFKI Projektbüro Berlin, Berlin, Germany
[3] SimulaMet Oslo, Oslo, Norway
Source
2020 TWELFTH INTERNATIONAL CONFERENCE ON QUALITY OF MULTIMEDIA EXPERIENCE (QOMEX), 2020
Keywords
crowdsourcing; gaming; QoE; evaluation
DOI
10.1109/qomex48832.2020.9123122
CLC Classification
TP39 [Applications of Computers]
Discipline Codes
081203; 0835
Abstract
Traditionally, Quality of Experience (QoE) is assessed in a controlled laboratory environment where participants give their opinion about the perceived quality of a stimulus on a standardized rating scale. Recently, the use of crowdsourcing micro-task platforms for assessing media quality has been increasing. Crowdsourcing platforms provide access to a geographically distributed and demographically diverse pool of workers who participate in the experiment in their own environment and using their own hardware. The main challenge in crowdsourced QoE tests is to control the effect of interfering influence factors, such as a user's environment and device, on the subjective ratings. While the crowdsourcing approach has frequently been used in the past for speech and video quality assessment, research on quality assessment for gaming services is rare. In this paper, we present a method to measure gaming QoE under typically considered system influence factors, including delay, packet loss, and framerate, as well as different game designs. These factors are manipulated artificially through controlled changes to the implementation of the games. We discuss the results of a total of five studies that use an evaluation method based on a combination of ITU-T Rec. P.809 on subjective evaluation methods for gaming quality and ITU-T Rec. P.808 on subjective evaluation of speech quality with a crowdsourcing approach. To evaluate the reliability and validity of results collected using this method, we finally compare subjective ratings regarding the effect of network delay on gaming QoE gathered from interactive crowdsourcing tests with those from equivalent laboratory experiments.
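The abstract states that delay, packet loss, and framerate are manipulated through controlled changes in the game implementations, but no code is published with the record. As a purely illustrative aid, the minimal Python sketch below shows one common way such an artificial input delay could be injected into a game loop, namely by buffering input events until a fixed delay has elapsed; the class and method names (DelayedInput, push, poll) are hypothetical and are not taken from the paper.

import collections
import time

class DelayedInput:
    """Buffers input events and releases them only after a fixed
    artificial delay, approximating one way a test condition such as
    200 ms of added delay could be injected into a game loop.
    Illustrative sketch only; not the authors' implementation."""

    def __init__(self, delay_ms):
        self.delay = delay_ms / 1000.0
        self.queue = collections.deque()  # (release_time, event) pairs

    def push(self, event):
        # Stamp each incoming event with the earliest time it may act.
        self.queue.append((time.monotonic() + self.delay, event))

    def poll(self):
        # Release every event whose artificial delay has elapsed.
        now = time.monotonic()
        ready = []
        while self.queue and self.queue[0][0] <= now:
            ready.append(self.queue.popleft()[1])
        return ready

# Example: a 200 ms delay condition for one test session.
inputs = DelayedInput(delay_ms=200)
inputs.push("jump")
time.sleep(0.25)        # game loop runs on; the delay elapses
print(inputs.poll())    # -> ['jump']

Framerate and packet-loss conditions could be realized analogously under the same assumption, for example by skipping render calls or by dropping queued events with a fixed probability.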
Pages: 6