Instructor-Written Hints as Automated Test Suite Quality Feedback

Cited by: 0
Authors
Perretta, James [1 ]
DeOrio, Andrew [2 ]
Guha, Arjun [1 ]
Bell, Jonathan [1 ]
Affiliations
[1] Northeastern Univ, Boston, MA 02115 USA
[2] Univ Michigan, Ann Arbor, MI USA
Keywords
mutation testing; software testing education; automated feedback
DOI
Not available
CLC classification
TP39 [Computer Applications]
Subject classification codes
081203; 0835
Abstract
Mutation testing measures a test suite's ability to detect bugs by inserting bugs into the code and seeing if the tests behave differently. Mutation testing has recently seen increased adoption in industrial and open-source software but sees limited use in education. Some instructors use manually-constructed mutants to evaluate student tests and provide general automated feedback. Additional tutoring requires more intensive instructor interaction such as in office hours, which requires substantial resources at scale. Prior work suggests that students benefit from frequent, actionable feedback, and our work focuses on the challenge of leveraging automation to give students high-quality feedback when they need it. We deployed an automated hint system that provides instructor-written hints related to mutants that student-written tests do not detect. We evaluated our hint system in a controlled experiment across four assignments in two introductory programming courses, comprising 4,122 students. We also analyzed student test suite revisions and conducted a mixed-methods analysis of student hint ratings and comments collected by the automated hint system. We observed a small, statistically significant increase in the mean number of mutants detected by students who received hints (experiment group) compared to those who did not (control group). In 25% of instances where students received a hint, they detected the mutant in a single revision to their test suite. We conclude with recommendations based on our analysis as a starting point for instructors who wish to deploy this type of automated feedback.
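The abstract's core idea, that a test suite's quality can be measured by whether it distinguishes a program from buggy variants ("mutants"), can be illustrated with a minimal sketch. The function names and the single mutation operator below are hypothetical examples, not code from the paper or its grading system:

```python
# Minimal sketch of mutation testing (illustrative; all names are hypothetical).
# A "mutant" is the original code with one small, deliberate bug inserted.
# A test suite "detects" (kills) a mutant if some test that passes on the
# original implementation fails on the mutant.

def max_of(a, b):
    """Original implementation under test."""
    return a if a >= b else b

def max_of_mutant(a, b):
    """Mutant: comparison operator flipped (>= changed to <=)."""
    return a if a <= b else b

def run_suite(fn):
    """Return True if every test case passes for the given implementation."""
    tests = [
        ((3, 5), 5),
        ((5, 3), 5),  # this case distinguishes the mutant from the original
    ]
    return all(fn(a, b) == expected for (a, b), expected in tests)

print(run_suite(max_of))          # True: suite passes on the original
print(not run_suite(max_of_mutant))  # True: suite detects (kills) the mutant
```

A student suite containing only the first test case would pass on both versions and leave the mutant undetected; the hint system described in the abstract targets exactly such surviving mutants with instructor-written hints.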
Pages: 910-916 (7 pages)