HULDRA: A Framework for Collecting Crowdsourced Feedback on Multimedia Assets

Cited by: 2
Authors
Hammou, Malek [1]
Midoglu, Cise [1]
Hicks, Steven A. [1]
Storås, Andrea [1,2]
Sabet, Saeed Shafiee [1]
Strümke, Inga [1]
Riegler, Michael A. [1,3]
Halvorsen, Pål [1,2]
Affiliations
[1] SimulaMet, Oslo, Norway
[2] Oslo Metropolitan University, Oslo, Norway
[3] UiT The Arctic University of Norway, Tromsø, Norway
Keywords
crowdsourced feedback; multimedia content; open source; survey; UI; user study; web application;
DOI
10.1145/3524273.3532887
CLC Number
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
Collecting crowdsourced feedback to evaluate, rank, or score multimedia content can be cumbersome and time-consuming. Most existing survey tools are complicated, hard to customize, or tailored to a specific asset type. In this paper, we present an open source framework called Huldra, designed explicitly to address the challenges associated with user studies involving crowdsourced feedback collection. The web-based framework is built in a modular and configurable fashion to allow easy adjustment of the user interface (UI) and the multimedia content, while providing integrations with reliable and stable backend solutions to facilitate the collection and analysis of responses. Our proposed framework can be used as an online survey tool by researchers working on topics such as Machine Learning (ML), audio, image, and video quality assessment, and Quality of Experience (QoE), who require user studies for benchmarking various types of multimedia content.
Pages: 203-209
Page count: 7
Related Papers
50 records in total
  • [1] Improving Consistency of Crowdsourced Multimedia Similarity for Evaluation
    Organisciak, Peter
    Downie, J. Stephen
    PROCEEDINGS OF THE 15TH ACM/IEEE-CS JOINT CONFERENCE ON DIGITAL LIBRARIES (JCDL'15), 2015, : 115 - 118
  • [2] Crowdsourced Argumentation Feedback for Persuasive Writing
    Ihoriya, Hiroki
    Yamamoto, Yusuke
    HUMAN INTERFACE AND THE MANAGEMENT OF INFORMATION, HIMI 2023, PT II, 2023, 14016 : 461 - 475
  • [3] Supporting Creative Workers with Crowdsourced Feedback
    Oppenlaender, Jonas
    PROCEEDINGS OF THE 2019 ON CREATIVITY AND COGNITION - C&C '19, 2019, : 646 - 652
  • [4] Collecting Client Feedback
    Lambert, Michael J.
    Shimokawa, Kenichi
    PSYCHOTHERAPY, 2011, 48 (01) : 72 - 79
  • [5] Collecting data for multimedia management approaches
    Phillips, JB
    Hindawi, M
    Phillips, A
    Demonsabert, S
    Bailey, RV
    POLLUTION ENGINEERING, 1998, 30 (03) : 37 - 40
  • [6] A Multimedia Retrieval Framework Based on Semi-Supervised Ranking and Relevance Feedback
    Yang, Yi
    Nie, Feiping
    Xu, Dong
    Luo, Jiebo
    Zhuang, Yueting
    Pan, Yunhe
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2012, 34 (04) : 723 - 742
  • [7] Policy for sustainable entrepreneurship: A crowdsourced framework
    Watson, Rosina
    Nielsen, Kristian Roed
    Wilson, Hugh N.
    Macdonald, Emma K.
    Mera, Christine
    Reisch, Lucia
    JOURNAL OF CLEANER PRODUCTION, 2023, 283
  • [8] Cloud-Assisted Live Streaming for Crowdsourced Multimedia Content
    Chen, Fei
    Zhang, Cong
    Wang, Feng
    Liu, Jiangchuan
    Wang, Xiaofeng
    Liu, Yuan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2015, 17 (09) : 1471 - 1483
  • [9] Collecting Middle-Class Memories? The Pandemic, Technology, and Crowdsourced Archives
    Zumthurm, Tizian
    Krebs, Stefan
    TECHNOLOGY AND CULTURE, 2022, 63 (02) : 483 - 493
  • [10] Crowdsourced morphometrics: a novel method to overcome bottlenecks in collecting phenotype data
    Chang, J.
    Rabosky, D. L.
    Alfaro, M. E.
    INTEGRATIVE AND COMPARATIVE BIOLOGY, 2014, 54 : E35 - E35