Scoring Rubric for Grading Students' Participation in Online Discussions

Cited by: 6
|
Authors
Lunney, Margaret [1 ]
Sammarco, Angela [1 ]
Affiliations
[1] CUNY Coll Staten Isl, Dept Nursing, Staten Isl, NY 10314 USA
Keywords
Grading; Interrater reliability; Online education; Scoring rubric; Students' online discussions;
DOI
10.1097/NCN.0b013e31818dd3f6
Chinese Library Classification (CLC)
TP39 [Applications of Computers];
Discipline classification codes
081203; 0835
Abstract
Required discussions of course readings motivate students to learn course content and can be used to validate students' comprehension of course content and processes. A tool for grading students' weekly discussions of course work was tested to evaluate the interrater reliability of grading by two faculty members. The purpose of this article is to describe psychometric testing of the interrater reliability of this grading method. Using the grading tool, independent ratings of five students' online discussion postings were recorded by both faculty members over a 5-week period, which provided the data for this study. Data were analyzed using Spearman ρ and Kendall tau-b statistics. The findings revealed that the overall correlations of rater scores were satisfactory, indicating that an acceptable level of interrater reliability was obtained through use of the grading tool. Reliable tools for evaluating students' online discussions contribute to the knowledge needed for the implementation of online courses.
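As context for the analysis described in the abstract, the following is a minimal sketch (not taken from the article) of how two raters' rubric scores for the same set of discussion postings might be compared with Spearman's ρ and Kendall's tau-b; the score values are invented purely for illustration.

```python
# Minimal illustrative sketch: interrater correlation of two raters' rubric scores.
# The score lists are hypothetical, not data from the study.
from scipy.stats import spearmanr, kendalltau

# Hypothetical rubric scores assigned by two faculty raters to the same postings
rater_a = [4, 3, 5, 2, 4, 5, 3, 4, 2, 5]
rater_b = [4, 3, 4, 2, 5, 5, 3, 4, 3, 5]

rho, rho_p = spearmanr(rater_a, rater_b)    # Spearman rank correlation
tau, tau_p = kendalltau(rater_a, rater_b)   # Kendall tau-b (default variant, handles ties)

print(f"Spearman rho  = {rho:.2f} (p = {rho_p:.3f})")
print(f"Kendall tau-b = {tau:.2f} (p = {tau_p:.3f})")
```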
Pages: 26-31 (6 pages)