An evaluation framework for software crowdsourcing

Cited by: 0
Authors
Wenjun Wu
Wei-Tek Tsai
Wei Li
机构
[1] Beihang University, State Key Laboratory of Software Development Environment
[2] Arizona State University, School of Computing, Informatics, and Decision Systems Engineering
[3] Tsinghua University, Department of Computer Science and Technology, INLIST
Source
Keywords
crowdsourcing; software engineering; competition rules; game theory
DOI
Not available
CLC number
Subject classification
Abstract
Software crowdsourcing has recently emerged as an area of software engineering, yet few papers have offered a systematic analysis of its practices. This paper first presents a framework for evaluating software crowdsourcing projects with respect to software quality, cost, diversity of solutions, and the competitive nature of crowdsourcing. Specifically, competitions are evaluated through the min-max relationship from game theory, in which one party tries to minimize an objective function while the other party tries to maximize the same objective function. The paper then defines a game-theoretic model to analyze the primary factors in these min-max competition rules that affect the nature of participation as well as software quality. Finally, using the proposed evaluation framework, the paper examines two crowdsourcing processes, Harvard-TopCoder and AppStori. The framework highlights the sharp contrasts between the two processes, as participants behave quite differently when engaging in these two projects.
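The min-max relationship described in the abstract can be sketched as a two-player zero-sum game; the notation below is an illustrative assumption, since the abstract gives no explicit formula. With one party choosing a strategy x from a set X to minimize a shared objective f, and the other party choosing y from a set Y to maximize the same objective, the competition value is

\[ \min_{x \in X} \max_{y \in Y} f(x, y) \]

where, under this reading, f might capture, for example, the cost borne by the minimizing party and the payoff sought by the maximizing party under the competition rules.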
Pages: 694-709
Number of pages: 15
Related papers
50 records in total
  • [31] Evaluation framework for open source software
    Koponen, T
    Hotti, V
    SERP'04: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING RESEARCH AND PRACTICE, VOLS 1 AND 2, 2004, : 897 - 902
  • [32] Application of Crowdsourcing in Software Development
    Suganthy, A.
    Chithralekha, T.
    2016 5TH INTERNATIONAL CONFERENCE ON RECENT TRENDS IN INFORMATION TECHNOLOGY (ICRTIT), 2016,
  • [33] Dynamics of Software Development Crowdsourcing
    Dubey, Alpana
    Abhinav, Kumar
    Taneja, Sakshi
    Virdi, Gurdeep
    Dwarakanath, Anurag
    Kass, Alex
    Kuriakose, Mani Suma
    2016 IEEE 11TH INTERNATIONAL CONFERENCE ON GLOBAL SOFTWARE ENGINEERING (ICGSE), 2016, : 49 - 58
  • [34] On moderating software crowdsourcing challenges
    de Souza, Cleidson R.B.
    Machado, Leticia S.
    Melo, Ricardo Rodrigo M.
    Association for Computing Machinery, (04)
  • [35] Disruption and Deception in Crowdsourcing: Towards a Crowdsourcing Risk Framework
    Onuchowska, Agnieszka
    de Vreede, Gert-Jan
    PROCEEDINGS OF THE 51ST ANNUAL HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES (HICSS), 2018, : 3966 - 3975
  • [36] Investigating Software Standards: A Lens of Sustainability for Software Crowdsourcing
    Malik, Muhammad Noman
    Khan, Huma Hayat
    IEEE ACCESS, 2018, 6 : 5139 - 5150
  • [37] A Framework for Using Crowdsourcing in Government
    Clark, Benjamin Y.
    Zingale, Nicholas
    Logan, Joseph
    Brudney, Jeffrey
    INTERNATIONAL JOURNAL OF PUBLIC ADMINISTRATION IN THE DIGITAL AGE, 2016, 3 (04) : 57 - 75
  • [38] A programming framework for Spatial Crowdsourcing
    Fonteles, Andre Sales
    Bouveret, Sylvain
    Gensel, Jerome
    MOMM 2017: THE 15TH INTERNATIONAL CONFERENCE ON ADVANCES IN MOBILE COMPUTING & MULTIMEDIA, 2017, : 131 - 140
  • [39] Framework supporting software assets evaluation on trustworthiness
    Cai S.-B.
    Zou Y.-Z.
    Shao L.-S.
    Xie B.
    Shao W.-Z.
    Ruan Jian Xue Bao/Journal of Software, 2010, 21 (02) : 359 - 372
  • [40] Theoretical Framework of Technical Kinematics Evaluation Software
    Xu Xiaofeng
    Di Jianyong
    Jing Lixian
    Gao Yansong
    INFORMATION COMPUTING AND APPLICATIONS, ICICA 2013, PT I, 2013, 391 : 304 - 313