Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong

From the perspective of a PaperRater user, the author investigates the reliability of the program. Twenty-four freshman students and one writing teacher at Dalat University, Vietnam, were recruited for the study. The author also served as a scorer. The scores generated by Paper...


Bibliographic Details
Main Author: Nguyen, Vi Thong
Format: Article
Language:English
Published: Universiti Teknologi MARA, Kedah 2017
Subjects:
Online Access:https://ir.uitm.edu.my/id/eprint/30281/
_version_ 1848807455203524608
author Nguyen, Vi Thong
author_facet Nguyen, Vi Thong
author_sort Nguyen, Vi Thong
building UiTM Institutional Repository
collection Online Access
description From the perspective of a PaperRater user, the author investigates the reliability of the program. Twenty-four freshman students and one writing teacher at Dalat University, Vietnam, were recruited for the study. The author also served as a scorer. The scores generated by PaperRater and the two human scorers were analyzed quantitatively and qualitatively. The statistical results indicate an excellent correlation between the mean scores of the three scorers. With the aid of SPSS and further calculation, PaperRater is shown to have acceptable reliability, which implies that the program can assist in grading students’ papers. A semi-structured interview with the teacher scorer at the qualitative stage pointed out several challenges that writing teachers may encounter when assessing students’ responses to prompts. From her perspective, PaperRater’s assistance would greatly relieve the burden of assessing a large number of scripts in a short period. However, how the program can best be employed by teachers should be carefully investigated. This study therefore provides writing teachers with pedagogical implications for how PaperRater should be used in writing classrooms, and is expected to shed new light on the possibility of adopting an automated evaluation instrument as a scoring assistant in large writing classrooms.
first_indexed 2025-11-14T22:43:05Z
format Article
id uitm-30281
institution Universiti Teknologi MARA
institution_category Local University
language English
last_indexed 2025-11-14T22:43:05Z
publishDate 2017
publisher Universiti Teknologi MARA, Kedah
recordtype eprints
repository_type Digital Repository
spelling uitm-302812020-05-04T03:07:08Z https://ir.uitm.edu.my/id/eprint/30281/ Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong cpit Nguyen, Vi Thong Computers in education. Information technology Systems of individual educators and writers From the perspective of a PaperRater user, the author investigates the reliability of the program. Twenty-four freshman students and one writing teacher at Dalat University, Vietnam, were recruited for the study. The author also served as a scorer. The scores generated by PaperRater and the two human scorers were analyzed quantitatively and qualitatively. The statistical results indicate an excellent correlation between the mean scores of the three scorers. With the aid of SPSS and further calculation, PaperRater is shown to have acceptable reliability, which implies that the program can assist in grading students’ papers. A semi-structured interview with the teacher scorer at the qualitative stage pointed out several challenges that writing teachers may encounter when assessing students’ responses to prompts. From her perspective, PaperRater’s assistance would greatly relieve the burden of assessing a large number of scripts in a short period. However, how the program can best be employed by teachers should be carefully investigated. This study therefore provides writing teachers with pedagogical implications for how PaperRater should be used in writing classrooms, and is expected to shed new light on the possibility of adopting an automated evaluation instrument as a scoring assistant in large writing classrooms. Universiti Teknologi MARA, Kedah 2017 Article PeerReviewed text en https://ir.uitm.edu.my/id/eprint/30281/1/AJ_NGUYEN%20VI%20THONG%20CPLT%20K%2017.pdf Nguyen, Vi Thong (2017) Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong.
(2017) Journal of Creative Practices in Language Learning and Teaching (CPLT) <https://ir.uitm.edu.my/view/publication/Journal_of_Creative_Practices_in_Language_Learning_and_Teaching_=28CPLT=29.html>, 5 (1). pp. 1-18. ISSN 1823-464X https://cplt.uitm.edu.my/
spellingShingle Computers in education. Information technology
Systems of individual educators and writers
Nguyen, Vi Thong
Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title_full Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title_fullStr Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title_full_unstemmed Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title_short Automated essay assessment: an evaluation on paperrater’s reliability from practice / Nguyen Vi Thong
title_sort automated essay assessment: an evaluation on paperrater’s reliability from practice / nguyen vi thong
topic Computers in education. Information technology
Systems of individual educators and writers
url https://ir.uitm.edu.my/id/eprint/30281/
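The abstract describes correlating PaperRater's scores with those of two human scorers in SPSS to gauge inter-rater reliability. A minimal sketch of that kind of analysis is below; the scores are invented for illustration, and plain pairwise Pearson correlation is used as a stand-in since the record does not state the study's exact statistical procedure.

```python
# Illustrative only: pairwise Pearson correlations between an automated
# scorer and two human scorers, a rough proxy for the inter-rater
# reliability check the abstract describes. All scores are hypothetical.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores (0-10 scale) for six essays from three scorers.
paperrater = [7.0, 6.5, 8.0, 5.5, 9.0, 6.0]
teacher    = [7.5, 6.0, 8.5, 5.0, 9.0, 6.5]
author     = [7.0, 6.0, 8.0, 6.0, 8.5, 6.5]

for name, human in [("teacher", teacher), ("author", author)]:
    print(f"PaperRater vs {name}: r = {pearson(paperrater, human):.2f}")
```

A high r between the automated and human scores (the abstract reports an "excellent correlation") is what would support using the tool as a scoring assistant rather than a replacement grader.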