Knowledge Assessment: Squeezing Information From Multiple-Choice Testing

Cited: 2
Authors
Nickerson, Raymond S. [1 ]
Butler, Susan F. [1 ]
Carlin, Michael T. [2 ]
Affiliations
[1] Tufts Univ, Dept Psychol, Medford, MA 02155 USA
[2] Rider Univ, Dept Psychol, Lawrenceville, NJ 08648 USA
Keywords
knowledge assessment; multiple-choice testing; confidence assessment; STRATEGIC REGULATION; DONT KNOW; PERSONAL PROBABILITIES; IMMEDIATE FEEDBACK; NUMBER-RIGHT; TEST-SCORES; VALIDITY; MEMORY; DISTRACTORS; RELIABILITY;
DOI
10.1037/xap0000041
CLC Classification
B849 [Applied Psychology];
Discipline Code
040203;
Abstract
Knowledge assessment via testing can be viewed from two vantage points: that of the test administrator and that of the test taker. From the administrator's perspective, the objective is to discover what an individual knows about a domain of interest; from the test taker's, the challenge is to reveal what one knows. In this article we describe a procedure for administering and scoring multiple-choice tests that satisfies both of these objectives, and we present experimental data that demonstrate its effectiveness. The method allows test takers to report, for each alternative of an item, their confidence that it is the correct answer, and it makes guessing not only unnecessary but detrimental. From this information the administrator can derive measures of both knowledge and confidence, which, we argue, yield better estimates than scoring systems that do not allow measurement of partial knowledge. The use of such measures for evaluating both individual test takers' knowledge of a subject and the effectiveness of instruction in that subject is discussed.
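The record does not give the authors' scoring formula. As a minimal sketch of the general idea described in the abstract (a test taker distributes confidence over an item's alternatives, and the scoring rule rewards honest reporting while making blind guessing detrimental in expectation), a quadratic, Brier-style proper scoring rule is one possibility. The function name, option labels, and the choice of the quadratic rule are illustrative assumptions, not the paper's published method.

```python
# Hypothetical illustration only: the paper's exact scoring rule is not given in
# this record. This sketch uses a quadratic (Brier-style) proper scoring rule,
# under which honestly reporting one's confidence maximizes the expected score
# and feigned certainty (blind guessing) is penalized in expectation.

def brier_item_score(confidences, correct_option):
    """Score one multiple-choice item.

    confidences    -- dict mapping option label to the test taker's probability
                      that the option is the correct answer; values must sum to 1.
    correct_option -- label of the keyed answer.

    Returns a score in [0, 1]; 1 means full confidence in the keyed option.
    """
    if abs(sum(confidences.values()) - 1.0) > 1e-6:
        raise ValueError("Confidences must sum to 1")
    # Quadratic penalty: squared distance between the reported distribution and
    # the truth indicator vector (1 for the keyed option, 0 for the others).
    penalty = sum(
        (p - (1.0 if option == correct_option else 0.0)) ** 2
        for option, p in confidences.items()
    )
    # Rescale so the score lies in [0, 1]; the penalty ranges from 0 to 2.
    return 1.0 - penalty / 2.0


if __name__ == "__main__":
    # Partial knowledge: the test taker can rule out C and D but cannot decide
    # between A and B (A is keyed). Honest reporting earns 0.75, whereas an
    # all-or-nothing guess between A and B has an expected score of only 0.5.
    honest = {"A": 0.5, "B": 0.5, "C": 0.0, "D": 0.0}
    guess_right = {"A": 1.0, "B": 0.0, "C": 0.0, "D": 0.0}
    guess_wrong = {"A": 0.0, "B": 1.0, "C": 0.0, "D": 0.0}

    print(brier_item_score(honest, "A"))       # 0.75
    print(brier_item_score(guess_right, "A"))  # 1.0
    print(brier_item_score(guess_wrong, "A"))  # 0.0
```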
Pages: 167-177
Page count: 11
Related Papers
50 records in total
  • [31] Multiple-Choice Testing in Education: Are the Best Practices for Assessment Also Good for Learning?
    Butler, Andrew C.
    JOURNAL OF APPLIED RESEARCH IN MEMORY AND COGNITION, 2018, 7 (03) : 323 - 331
  • [32] Statewide performance assessment as a complement to multiple-choice testing in high school economics
    Harris, RB
    Kerby, WC
    JOURNAL OF ECONOMIC EDUCATION, 1997, 28 (02): 122 - &
  • [33] Automated Generation and Tagging of Knowledge Components from Multiple-Choice Questions
    Moore, Steven
    Schmucker, Robin
    Mitchell, Tom
    Stamper, John
    PROCEEDINGS OF THE ELEVENTH ACM CONFERENCE ON LEARNING@SCALE, L@S 2024, 2024, : 122 - 133
  • [34] Nonparametric cognitive diagnostic computerized adaptive testing using multiple-choice option information
    Sun Xiaojian
    Guo Lei
    ACTA PSYCHOLOGICA SINICA, 2022, 54 (09) : 1137 - 1150
  • [35] The Testing Effect in University Teaching: Using Multiple-Choice Testing to Promote Retention of Highly Retrievable Information
    Greving, Sven
    Lenhard, Wolfgang
    Richter, Tobias
    TEACHING OF PSYCHOLOGY, 2023, 50 (04) : 332 - 341
  • [36] Diagnostic Assessment With Ordered Multiple-Choice Items
    Briggs, Derek C.
    Alonzo, Alicia C.
    Schwab, Cheryl
    Wilson, Mark
    EDUCATIONAL ASSESSMENT, 2006, 11 (01) : 33 - 63
  • [37] ASSESSMENT IN MEDICINE - MULTIPLE-CHOICE QUESTION CONTROVERSY
    ANDERSON, J
    MEDICAL TEACHER, 1979, 1 (06) : 303 - 303
  • [38] MULTIPLE-CHOICE
    LEU, AJ
    VASA-JOURNAL OF VASCULAR DISEASES, 1984, 13 (01): 3 - 3
  • [39] MULTIPLE-CHOICE
    RODDIE, IC
    LANCET, 1973, 2 (7823): 258 - 258
  • [40] MULTIPLE-CHOICE
    GRAY, J
    LANCET, 1973, 2 (7825): 386 - 387