The Effect of Using Automated Essay Evaluation on ESL Undergraduate Students' Writing Skill

Cited by: 13
Authors
Aluthman, Ebtisam S. [1 ]
Affiliation
[1] Princess Norah Bint Abdulruhman Univ, Dept Appl Linguist, Riyadh, Saudi Arabia
Keywords
Automated Essay Evaluation; academic writing; language assessment; Saudi ESL undergraduate students
DOI
10.5539/ijel.v6n5p54
Chinese Library Classification
H0 [Linguistics]
Discipline Classification Codes
030303; 0501; 050102
Abstract
Advances in Natural Language Processing (NLP) have yielded significant progress in the field of language assessment. The Automated Essay Evaluation (AEE) mechanism relies on basic research in computational linguistics focused on transforming human language into algorithmic forms. The Criterion (R) system is an instance of AEE software that provides both formative feedback and an automated holistic score. This paper investigates the impact of this newly developed AEE software in a current ESL setting by measuring the effectiveness of the Criterion (R) system in improving ESL undergraduate students' writing performance. Data were collected from sixty-one ESL undergraduate students in an academic writing course in the English Language Department at Princess Norah bint Abdulruhman University (PNU). The researcher employed a repeated-measures design to test the potential effects of the formative feedback and automated holistic score on overall writing proficiency over time. Results indicated that the Criterion (R) system had a positive effect on students' scores on their writing tasks. Results also suggested that students' writing mechanics improved significantly, while grammar, usage, and style showed only moderate improvement. These findings are discussed in relation to the AEE literature. The paper concludes by discussing the implications of implementing AEE software in educational contexts.
Pages: 54-67
Page count: 14