Crowdsourcing the Evaluation of Multiple-Choice Questions Using Item-Writing Flaws and Bloom's Taxonomy

Cited by: 2
Authors
Moore, Steven [1 ]
Fang, Ellen [1 ]
Nguyen, Huy A. [1 ]
Stamper, John [1 ]
Affiliations
[1] Carnegie Mellon University, Human-Computer Interaction Institute, Pittsburgh, PA 15213, USA
Keywords
crowdsourcing; learnersourcing; question evaluation; question quality; question generation
DOI
10.1145/3573051.3593396
CLC Number
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Multiple-choice questions, which are widely used in educational assessments, have the potential to negatively impact student learning and skew analytics when they contain item-writing flaws. Existing methods for evaluating multiple-choice questions in educational contexts tend to focus primarily on machine readability metrics, such as grammar, syntax, and formatting, without considering the intended use of the questions within course materials and their pedagogical implications. In this study, we present the results of crowdsourcing the evaluation of multiple-choice questions based on 15 common item-writing flaws. Through analysis of 80 crowdsourced evaluations on questions from the domains of calculus and chemistry, we found that crowdworkers were able to accurately evaluate the questions, matching 75% of the expert evaluations across multiple questions. They were able to correctly distinguish between two levels of Bloom's Taxonomy for the calculus questions, but were less accurate for chemistry questions. We discuss how to scale this question evaluation process and the implications it has across other domains. This work demonstrates how crowdworkers can be leveraged in the quality evaluation of educational questions, regardless of prior experience or domain knowledge.
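The abstract reports that crowdworkers matched 75% of expert evaluations across questions from calculus and chemistry. As a rough, hypothetical illustration only (the paper's actual scoring procedure is not reproduced here), the Python sketch below computes a crowd-versus-expert agreement rate over per-flaw yes/no judgments; the flaw labels, data layout, and function name are assumptions, not the authors' rubric.

from typing import Dict, List

# Placeholder labels standing in for the 15 item-writing flaws mentioned
# in the abstract; the real rubric wording is not reproduced here.
FLAWS: List[str] = [f"flaw_{i}" for i in range(1, 16)]

def agreement_rate(
    crowd: Dict[str, Dict[str, bool]],
    expert: Dict[str, Dict[str, bool]],
) -> float:
    """Fraction of (question, flaw) judgments on which crowd and expert agree."""
    matches = 0
    total = 0
    for question_id, expert_flags in expert.items():
        crowd_flags = crowd.get(question_id, {})
        for flaw in FLAWS:
            if flaw in expert_flags and flaw in crowd_flags:
                total += 1
                matches += int(crowd_flags[flaw] == expert_flags[flaw])
    return matches / total if total else 0.0

# Toy usage: one calculus question judged on two flaws; the crowd agrees
# with the expert on one of the two, giving an agreement rate of 0.5.
expert_eval = {"calc_q1": {"flaw_1": True, "flaw_2": False}}
crowd_eval = {"calc_q1": {"flaw_1": True, "flaw_2": True}}
print(agreement_rate(crowd_eval, expert_eval))  # 0.5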
Pages: 25-34 (10 pages)
Related Articles (50 total)
  • [11] Writing Multiple-Choice Questions
    Boland, Robert J.
    Lester, Natalie A.
    Williams, Eric
    ACADEMIC PSYCHIATRY, 2010, 34 (04) : 310 - 316
  • [12] Construct-irrelevant variance and flawed test questions: Do multiple-choice item-writing principles make any difference?
    Downing, SM
    ACADEMIC MEDICINE, 2002, 77 (10) : S103 - S104
  • [13] Pushing Critical Thinking Skills With Multiple-Choice Questions: Does Bloom's Taxonomy Work?
    Zaidi, Nikki L. Bibler
    Grob, Karri L.
    Monrad, Seetha M.
    Kurtz, Joshua B.
    Tai, Andrew
    Ahmed, Asra Z.
    Gruppen, Larry D.
    Santen, Sally A.
    ACADEMIC MEDICINE, 2018, 93 (06) : 856 - 859
  • [14] What faculty write versus what students see? Perspectives on multiple-choice questions using Bloom's taxonomy
    Monrad, Seetha U.
    Zaidi, Nikki L. Bibler
    Grob, Karri L.
    Kurtz, Joshua B.
    Tai, Andrew W.
    Hortsch, Michael
    Gruppen, Larry D.
    Santen, Sally A.
    MEDICAL TEACHER, 2021, 43 (05) : 575 - 582
  • [16] Examining Bloom's Taxonomy in Multiple Choice Questions: Students' Approach to Questions
    Stringer, J. K.
    Santen, Sally A.
    Lee, Eun
    Rawls, Meagan
    Bailey, Jean
    Richards, Alicia
    Perera, Robert A.
    Biskobing, Diane
    MEDICAL SCIENCE EDUCATOR, 2021, 31 (04) : 1311 - 1317
  • [17] Will a Short Training Session Improve Multiple-Choice Item-Writing Quality by Dental School Faculty? A Pilot Study
    Dellinges, Mark A.
    Curtis, Donald A.
    JOURNAL OF DENTAL EDUCATION, 2017, 81 (08) : 948 - 955
  • [18] Assessing cognitive skills of pharmacy students in a biomedical sciences module using a classification of multiple-choice item categories according to Bloom's taxonomy
    Knecht, KT
    AMERICAN JOURNAL OF PHARMACEUTICAL EDUCATION, 2001, 65 (04) : 324 - 334
  • [19] Using Automatic Item Generation to Create Multiple-Choice Questions for Pharmacy Assessment
    Leslie, Tara
    Gierl, Mark J.
    AMERICAN JOURNAL OF PHARMACEUTICAL EDUCATION, 2023, 87 (10)
  • [20] Guidelines for Writing Multiple-Choice Questions in Radiology Courses
    Vydareny, K. H.
    Blane, C. E.
    Calhoun, J. G.
    INVESTIGATIVE RADIOLOGY, 1986, 21 (11) : 871 - 876