Efficient and adaptive incentive selection for crowdsourcing contests

Cited: 0
Authors
Nhat Van-Quoc Truong
Le Cong Dinh
Sebastian Stein
Long Tran-Thanh
Nicholas R. Jennings
Affiliations
[1] University of Southampton,Electronics and Computer Science
[2] University of Warwick,Department of Computer Science
[3] Loughborough University
Source
Applied Intelligence | 2023 / Volume 53
Keywords
Incentive; Crowdsourcing; Online decision making;
DOI
Not available
Abstract
The success of crowdsourcing projects relies critically on motivating a crowd to contribute. One particularly effective method for incentivising participants to perform tasks is to run contests in which participants compete against each other for rewards. However, there are numerous ways to implement such contests in specific projects, which vary in how performance is evaluated, how participants are rewarded, and the sizes of the prizes. Moreover, the best way to implement contests in a particular project remains an open challenge, as the effectiveness of each contest implementation (henceforth, incentive) is unknown in advance. Hence, in a crowdsourcing project, a practical approach to maximising the overall utility of the requester (which can be measured by the total number of completed tasks or the quality of the task submissions) is to choose a set of incentives suggested by previous studies from the literature or by the requester's experience. An effective mechanism can then be applied to automatically select appropriate incentives from this set over different time intervals, so as to maximise the cumulative utility within a given financial budget and time limit. To this end, we present a novel approach to this incentive selection problem. Specifically, we formalise it as an online decision-making problem, where each action corresponds to offering a specific incentive. We then detail and evaluate a novel algorithm, HAIS, which solves the incentive selection problem efficiently and adaptively.
In theory, assuming all the estimates in HAIS (except the estimates of each incentive's effectiveness) are correct, we show that the algorithm achieves a regret bound of $\mathcal{O}(\sqrt{B/c})$, where B denotes the financial budget and c the average cost of the incentives. In experiments, the performance of HAIS is about 93% (up to 98%) of the optimal solution and about 9% (up to 40%) better than that of state-of-the-art algorithms across a broad range of settings, which vary in budget size, time limit, number of incentives, standard deviation of the incentives' utilities, and group size of the contests (i.e., the number of participants in a contest).
Pages: 9204-9234
Page count: 30