Efficient and adaptive incentive selection for crowdsourcing contests

Cited by: 0
Authors
Nhat Van-Quoc Truong
Le Cong Dinh
Sebastian Stein
Long Tran-Thanh
Nicholas R. Jennings
Affiliations
[1] University of Southampton, Electronics and Computer Science
[2] University of Warwick, Department of Computer Science
[3] Loughborough University
Source
Applied Intelligence | 2023 / Volume 53
Keywords
Incentive; Crowdsourcing; Online decision making
DOI
Not available
Abstract
The success of crowdsourcing projects relies critically on motivating a crowd to contribute. One particularly effective method for incentivising participants to perform tasks is to run contests, where participants compete against each other for rewards. However, there are numerous ways to implement such contests in specific projects, which vary in how performance is evaluated, how participants are rewarded, and the sizes of the prizes. Moreover, the best way to implement contests in a particular project remains an open challenge, as the effectiveness of each contest implementation (henceforth, incentive) is unknown in advance. Hence, in a crowdsourcing project, a practical approach to maximising the overall utility of the requester (which can be measured by the total number of completed tasks or the quality of the task submissions) is to choose a set of incentives suggested by previous studies from the literature or by the requester's experience. An effective mechanism can then be applied to automatically select appropriate incentives from this set over different time intervals, so as to maximise the cumulative utility within a given financial budget and a time limit. To this end, we present a novel approach to this incentive selection problem. Specifically, we formalise it as an online decision-making problem, where each action corresponds to offering a specific incentive. We then detail and evaluate a novel algorithm, HAIS, which solves the incentive selection problem efficiently and adaptively.
In theory, in the case that all the estimates in HAIS (except the estimates of the effectiveness of each incentive) are correct, we show that the algorithm achieves a regret bound of $\mathcal{O}(\sqrt{B/c})$, where B denotes the financial budget and c is the average cost of the incentives. In experiments, the performance of HAIS is about 93% (up to 98%) of the optimal solution and about 9% (up to 40%) better than that of state-of-the-art algorithms in a broad range of settings, which vary in budget sizes, time limits, numbers of incentives, standard deviations of the incentives' utilities, and group sizes of the contests (i.e., the numbers of participants in a contest).
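The abstract frames incentive selection as a budget-limited online decision-making problem: each round, one incentive is offered, its cost is deducted from the budget B, and its observed utility refines the requester's estimates. The sketch below illustrates that general framing with a generic UCB-style selection loop; it is an illustrative assumption, not the HAIS algorithm from the paper, and the names `select_incentives` and `pull` are hypothetical.

```python
import math

def select_incentives(costs, pull, budget):
    """Budget-limited, UCB-style incentive selection loop.

    Illustrative sketch only -- NOT the HAIS algorithm from the paper.
    costs  : per-round cost of each incentive
    pull   : pull(i) -> observed utility of offering incentive i once
    budget : total financial budget B
    """
    n = len(costs)
    counts = [0] * n      # how often each incentive has been offered
    means = [0.0] * n     # empirical mean utility of each incentive
    spent, total_utility, t = 0.0, 0.0, 0

    while True:
        if t < n:
            i = t  # offer every incentive once to initialise estimates
        else:
            # pick the incentive with the highest upper confidence
            # bound on utility per unit cost
            i = max(range(n),
                    key=lambda k: (means[k]
                                   + math.sqrt(2 * math.log(t) / counts[k]))
                                  / costs[k])
        if spent + costs[i] > budget:
            break  # the budget cannot cover another round
        reward = pull(i)
        spent += costs[i]
        total_utility += reward
        counts[i] += 1
        means[i] += (reward - means[i]) / counts[i]  # running mean update
        t += 1
    return total_utility, counts
```

Under this framing, the exploration term shrinks as an incentive is offered more often, so the loop gradually concentrates the remaining budget on the incentives with the best observed utility-to-cost ratio.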
Pages: 9204-9234 (30 pages)