An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type

Cited: 3
Authors
Sanyal, Pallab [1 ]
Ye, Shun [1 ]
Affiliation
[1] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
Keywords
crowdsourcing; design contests; feedback; convergence; diversity; INNOVATION CONTESTS; PRODUCTIVITY LOSS; OUTCOME-FEEDBACK; CREATIVITY; IDEA; PARTICIPATION; UNCERTAINTY; GENERATION; BEHAVIOR; QUALITY;
DOI
10.1287/isre.2023.1232
CLC Number
G25 [Library Science, Librarianship]; G35 [Information Science, Information Work];
Discipline Code
1205 ; 120501 ;
Abstract
As more businesses turn to crowdsourcing platforms for solutions to business problems, determining how to manage sourcing contests to meet their objectives has become critically important. Existing research, both theoretical and empirical, studies the impact of a variety of contest and contestant characteristics on the outcomes of these contests. Beyond these static design parameters, one lever organizations (clients) can use to dynamically steer contests toward desirable goals is the feedback offered to contestants (solvers) during the contest. Although a handful of recent studies focus on the effects of feedback at a high level (e.g., volume, valence), to the best of our knowledge, none has examined the effects of the information contained in the feedback. Furthermore, existing studies focus solely on the quality of submissions and not on other critical contest outcomes, such as the diversity of submissions, which the creativity and innovation literature finds to be significant. In this study, first, drawing on the psychology literature on feedback intervention theory, we classify client feedback into two types: outcome and process. Second, using data from almost 12,000 design contests, we empirically examine the effects of the two types of feedback on the convergence and diversity of submissions following feedback interventions. We find that process feedback, which provides goal-oriented information to solvers, fosters convergent thinking, leading to submissions that are similar to one another. Although outcome feedback lacks the informative value of process feedback, it encourages divergent thinking, the ability to produce a variety of solutions to a problem. Furthermore, we find that these effects are strengthened when feedback is provided earlier in the contest rather than later.
Based on our findings, we offer insights on how practitioners can strategically use the appropriate form of feedback either to generate greater diversity of solutions or to converge efficiently on an acceptable solution.
Pages: 394-413
Page count: 21
Related Papers
50 total
  • [21] Task Design, Motivation, and Participation in Crowdsourcing Contests
    Zheng, Haichao
    Li, Dahui
    Hou, Wenhua
    INTERNATIONAL JOURNAL OF ELECTRONIC COMMERCE, 2011, 15 (04) : 57 - 88
  • [22] What Sustains Individuals' Participation in Crowdsourcing Contests?
    Wang, Xuan
    Khasraghi, Hanieh Javadi
    Schneider, Helmut
    PROCEEDINGS OF THE 52ND ANNUAL HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES, 2019, : 136 - 145
  • [23] Efficient and adaptive incentive selection for crowdsourcing contests
    Nhat Van-Quoc Truong
    Le Cong Dinh
    Stein, Sebastian
    Long Tran-Thanh
    Jennings, Nicholas R.
    APPLIED INTELLIGENCE, 2023, 53 (08) : 9204 - 9234
  • [24] KNOW WHEN TO RUN: RECOMMENDATIONS IN CROWDSOURCING CONTESTS
    Mo, Jiahui
    Sarkar, Sumit
    Menon, Syam
    MIS QUARTERLY, 2018, 42 (03) : 919 - +
  • [25] Social mechanisms in crowdsourcing contests: a literature review
    Jain, Shilpi
    Deodhar, Swanand J.
    BEHAVIOUR & INFORMATION TECHNOLOGY, 2022, 41 (05) : 1080 - 1114
  • [26] Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status
    Pollok, Patrick
    Luettgens, Dirk
    Piller, Frank T.
    RESEARCH POLICY, 2019, 48 (01) : 98 - 114
  • [28] Joint vs. Separate Crowdsourcing Contests
    Hu, Ming
    Wang, Lu
    MANAGEMENT SCIENCE, 2021, 67 (05) : 2711 - 2728
  • [29] Optimal Feedback in Contests
    Ely, Jeffrey C.
    Georgiadis, George
    Khorasani, Sina
    Rayo, Luis
    REVIEW OF ECONOMIC STUDIES, 2023, 90 (05): : 2370 - 2394
  • [30] Submitting tentative solutions for platform feedback in crowdsourcing contests: breaking network closure with boundary spanning for team performance
    Khasraghi, Hanieh Javadi
    Wang, Xuan
    Sun, Jun
    Khasraghi, Bahar Javadi
    INFORMATION TECHNOLOGY & PEOPLE, 2023, 36 (06) : 2189 - 2210