An evaluation framework for software crowdsourcing

Cited by: 0
Authors
Wenjun Wu
Wei-Tek Tsai
Wei Li
Affiliations
[1] Beihang University,State Key Laboratory of Software Development Environment
[2] Arizona State University,School of Computing, Informatics, and Decision Systems Engineering
[3] Tsinghua University,Department of Computer Science and Technology, INLIST
Keywords
crowdsourcing; software engineering; competition rules; game theory
DOI
Not available
Abstract
Software crowdsourcing has recently become an emerging area of software engineering, yet few papers have presented a systematic analysis of its practices. This paper first presents a framework for evaluating software crowdsourcing projects with respect to software quality, cost, diversity of solutions, and the competitive nature of crowdsourcing. Specifically, competitions are evaluated by the min-max relationship from game theory among participants, where one party tries to minimize an objective function while the other party tries to maximize the same objective function. The paper then defines a game-theoretic model to analyze the primary factors in these min-max competition rules that affect the nature of participation as well as software quality. Finally, using the proposed evaluation framework, the paper examines two crowdsourcing processes, Harvard-TopCoder and AppStori. The framework reveals sharp contrasts between the two processes, as participants behave in drastically different ways when engaging with the two projects.
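The min-max relationship the abstract refers to can be sketched on a zero-sum game: one player guarantees a lower bound on the objective (maximin) while the other concedes an upper bound (minimax). The sketch below is illustrative only and is not taken from the paper; the payoff matrix and the strategy labels (worker strategies vs. requester rules) are hypothetical.

```python
# Minimal sketch of a min-max competition on a zero-sum payoff matrix.
# Rows = strategies of the maximizing player (e.g. a competing worker),
# columns = strategies of the minimizing player (e.g. the contest designer).
# All payoff values are hypothetical.

def maximin_value(payoff):
    """Value the maximizing (row) player can guarantee:
    pick the row whose worst-case (smallest) payoff is largest."""
    return max(min(row) for row in payoff)

def minimax_value(payoff):
    """Value the minimizing (column) player can concede at most:
    for each column assume the best row response, then pick the
    column that makes that best response smallest."""
    cols = range(len(payoff[0]))
    return min(max(row[c] for row in payoff) for c in cols)

# Hypothetical 2x2 payoff matrix.
payoff = [
    [3, 1],
    [2, 4],
]

print(maximin_value(payoff))  # 2: the row player can guarantee at least 2
print(minimax_value(payoff))  # 3: the column player concedes at most 3
```

When the two values coincide the game has a saddle point in pure strategies; here they differ (2 vs. 3), so an equilibrium would require mixed strategies.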
Pages: 694-709 (15 pages)
Related papers
50 entries in total
  • [1] An evaluation framework for software crowdsourcing
    Wu, Wenjun; Tsai, Wei-Tek; Li, Wei
    Frontiers of Computer Science, 2013, 7 (05): 694-709
  • [2] A Developer Recommendation Framework in Software Crowdsourcing Development
    Shao, Wei; Wang, Xiaoning; Jiao, Wenpin
    Software Engineering and Methodology for Emerging Domains, 2016, 675: 151-164
  • [3] A framework for evaluation of crowdsourcing platforms performance
    Moghadasi, Mohammadhasan; Shirmohammadi, Mehdi; Ghasemi, Ahmadreza
    Information Development, 2024, 40 (04): 635-647
  • [4] A Learning to Rank Framework for Developer Recommendation in Software Crowdsourcing
    Zhu, Jiangang; Shen, Beijun; Hu, Fanghuai
    2015 22nd Asia-Pacific Software Engineering Conference (APSEC 2015), 2015: 285-292
  • [5] Evaluation of Software Quality in the TopCoder Crowdsourcing Environment
    Wang, Xin; Wu, Wenjun; Hu, Zhenghui
    2017 IEEE 7th Annual Computing and Communication Workshop and Conference (IEEE CCWC-2017), 2017
  • [6] A Participant Recruitment Framework for Crowdsourcing based Software Requirement Acquisition
    Wang, Hao; Wang, Yasha; Wang, Jiangtao
    2014 IEEE 9th International Conference on Global Software Engineering (ICGSE), 2014: 65-73
  • [7] Evaluation of Software Quality for Competition-based Software Crowdsourcing Projects
    Li, Boshu; Wu, Wenjun; Hu, Zhenhui
    Proceedings of the 2018 7th International Conference on Software and Computer Applications (ICSCA 2018), 2018: 102-109
  • [8] Crowdsourcing Multimedia QoE Evaluation: A Trusted Framework
    Wu, Chen-Chi; Chen, Kuan-Ta; Chang, Yu-Chun; Lei, Chin-Laung
    IEEE Transactions on Multimedia, 2013, 15 (05): 1121-1137
  • [9] Blockchain-Empowered Decentralized Framework for Secure and Efficient Software Crowdsourcing
    Liu, Kang; Chen, Wuhui; Zhang, Zhen
    2020 IEEE World Congress on Services (SERVICES), 2020: 128-133