Crowdsourcing the Evaluation of Multiple-Choice Questions Using Item-Writing Flaws and Bloom's Taxonomy

Cited by: 2
Authors
Moore, Steven [1 ]
Fang, Ellen [1 ]
Nguyen, Huy A. [1 ]
Stamper, John [1 ]
Affiliations
[1] Carnegie Mellon Univ, Human Comp Interact, Pittsburgh, PA 15213 USA
Keywords
crowdsourcing; learnersourcing; question evaluation; question quality; question generation; quality
DOI
10.1145/3573051.3593396
CLC Classification Number
TP39 [Computer applications]
Subject Classification Codes
081203; 0835
Abstract
Multiple-choice questions, which are widely used in educational assessments, have the potential to negatively impact student learning and skew analytics when they contain item-writing flaws. Existing methods for evaluating multiple-choice questions in educational contexts tend to focus primarily on machine readability metrics, such as grammar, syntax, and formatting, without considering the intended use of the questions within course materials and their pedagogical implications. In this study, we present the results of crowdsourcing the evaluation of multiple-choice questions based on 15 common item-writing flaws. Through analysis of 80 crowdsourced evaluations on questions from the domains of calculus and chemistry, we found that crowdworkers were able to accurately evaluate the questions, matching 75% of the expert evaluations across multiple questions. They were able to correctly distinguish between two levels of Bloom's Taxonomy for the calculus questions, but were less accurate for chemistry questions. We discuss how to scale this question evaluation process and the implications it has across other domains. This work demonstrates how crowdworkers can be leveraged in the quality evaluation of educational questions, regardless of prior experience or domain knowledge.
Pages: 25-34
Page count: 10
Related Papers (50 total)
  • [1] Assessment of Item-Writing Flaws in Multiple-Choice Questions
    Nedeau-Cayo, Rosemarie
    Laughlin, Deborah
    Rus, Linda
    Hall, John
    JOURNAL FOR NURSES IN PROFESSIONAL DEVELOPMENT, 2013, 29 (02) : 52 - 57
  • [2] Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments
    Tarrant, Marie
    Ware, James
    MEDICAL EDUCATION, 2008, 42 (02) : 198 - 206
  • [3] ASSESSING INTER-RATER AGREEMENT ABOUT ITEM-WRITING FLAWS IN MULTIPLE-CHOICE QUESTIONS OF CLINICAL ANATOMY
    Guimaraes, B.
    Pais, J.
    Coelho, E.
    Silva, A.
    Povo, A.
    Lourinho, I.
    Severo, M.
    Ferreira, M. A.
    EDULEARN13: 5TH INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2013, : 5921 - 5924
  • [4] Evaluation of five guidelines for option development in multiple-choice item-writing
    Martinez, Rafael J.
    Moreno, Rafael
    Martin, Irene
    Trigo, M. Eva
    PSICOTHEMA, 2009, 21 (02) : 326 - 330
  • [5] A review of multiple-choice item-writing guidelines for classroom assessment
    Haladyna, TM
    Downing, SM
    Rodriguez, MC
    APPLIED MEASUREMENT IN EDUCATION, 2002, 15 (03) : 309 - 334
  • [6] ADAPTING MULTIPLE-CHOICE ITEM-WRITING GUIDELINES TO AN INDUSTRIAL CONTEXT
    Foster, Robert Michael
    ICEIS 2010: PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON ENTERPRISE INFORMATION SYSTEMS, VOL 4: SOFTWARE AGENTS AND INTERNET COMPUTING, 2010, : 71 - 74
  • [7] The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments
    Tarrant, Marie
    Knierim, Aimee
    Hayes, Sasha K.
    Ware, James
    NURSE EDUCATION TODAY, 2006, 26 (08) : 662 - 671
  • [8] Incorporation of Bloom's Taxonomy into Multiple-Choice Examination Questions for a Pharmacotherapeutics Course
    Kim, Myo-Kyoung
    Patel, Rajul A.
    Uchizono, James A.
    Beck, Lynn
    AMERICAN JOURNAL OF PHARMACEUTICAL EDUCATION, 2012, 76 (06)
  • [9] Knowledge of dental faculty in gulf cooperation council states of multiple-choice questions' item writing flaws
    Kowash, Mawlood
    Alhobeira, Hazza
    Hussein, Iyad
    Al Halabi, Manal
    Khan, Saif
    MEDICAL EDUCATION ONLINE, 2020, 25 (01)
  • [10] Development and use of a multiple-choice item writing flaws evaluation instrument in the context of general chemistry
    Breakall, Jared
    Randles, Christopher
    Tasker, Roy
    CHEMISTRY EDUCATION RESEARCH AND PRACTICE, 2019, 20 (02) : 369 - 382