The reproducibility of programming-related issues in Stack Overflow questions

Cited by: 3
Authors
Mondal, Saikat [1 ]
Rahman, Mohammad Masudur [2 ]
Roy, Chanchal K. [1 ]
Schneider, Kevin [1 ]
Affiliations
[1] Univ Saskatchewan, Dept Comp Sci, Software Res Lab, Saskatoon, SK, Canada
[2] Dalhousie Univ, Fac Comp Sci, Halifax, NS, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Issue reproducibility; Stack Overflow; Code segments; Code-level modifications; Reproducibility challenges; Understandability;
DOI
10.1007/s10664-021-10113-2
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline codes
081202; 0835;
Abstract
Software developers often look for solutions to their code-level problems on the Stack Overflow Q&A website. To receive help, developers frequently submit questions that contain sample code segments along with a description of the programming issue. Unfortunately, it is not always possible to reproduce the issues from the code segments they provide. Issues that are not easily reproducible may impede questions from receiving prompt and appropriate solutions. We conducted an exploratory study on the reproducibility of issues discussed in 400 Java and 400 Python questions. We parsed, compiled, executed, and carefully examined the code segments from these questions to reproduce the reported programming issues, expending 300 person-hours of effort. The outcomes of our study are threefold. First, we were able to reproduce the issues for approximately 68% of Java and 71% of Python code segments, whereas we were unable to reproduce approximately 22% of Java and 19% of Python issues. Of the reproducible issues, approximately 67% of the Java and 20% of the Python code segments required minor or major modifications before the issues could be reproduced. Second, we carefully investigated why programming issues could not be reproduced and provide evidence-based guidelines for writing effective code examples in Stack Overflow questions. Third, we investigated the correlation between the issue reproducibility status of questions and the corresponding answer meta-data, such as the presence of an accepted answer. According to our analysis, a reproducible question is at least twice as likely to receive an accepted answer as an irreproducible question. Moreover, the median delay in receiving an accepted answer doubles when the issues reported in questions cannot be reproduced. We also investigated the confounding factors (e.g., user reputation) that, besides reproducibility, can affect whether questions receive answers, and found that such factors do not undermine the correlation between reproducibility status and answer meta-data.
Pages: 52