Designing a comprehensive rubric for laboratory report assessment

Assessment moderation processes play a vital role in maintaining quality assurance for university courses. They ensure that assessment is consistent, reproducible and transparent, and they assure students that their work is marked fairly and against the stated learning outcomes. In line with Curtin's Assessment & Moderation Policy, we applied a moderation process to first-year science enabling units. A major assessment component of these units is the laboratory work, which involves taking a wide range of measurements of physical quantities with due regard to measurement uncertainties, analysing the data, and calculating and interpreting the results. Students then present their work as a formal, scientifically written report to their laboratory demonstrator for assessment. The reports are marked against a specific rubric that is available to students and demonstrators through Blackboard from the beginning of the semester.

To gauge variation in marking, eight demonstrators and two staff members were each given a set of six de-identified laboratory reports to mark using the current rubric. The results showed that the percentage standard deviation of the demonstrators' marks varied from 18% to 42% of the mean value. We believe this may reflect the wide range of demonstrators' experience and background knowledge, and whether they have completed Curtin's annually run Laboratory Demonstrators' Workshop. In consultation with the Office of the Dean of Teaching and Learning, the current rubric was redesigned to show a further breakdown of marks for future use. Following discussion with demonstrators and staff, the redesigned rubric was accepted with some modifications. To check the validity and reliability of the new rubric, another set of six reports was marked by the same assessors. In this presentation we discuss the results obtained with the current and the modified rubric.
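
As a rough illustration of the variation statistic quoted above, the sketch below computes the standard deviation of assessors' marks for a report as a percentage of the mean mark (the coefficient of variation). Reading "percentage standard deviation" as the coefficient of variation, and the marks themselves, are assumptions made purely for illustration; they are not taken from the paper.

    # Minimal sketch: standard deviation of assessors' marks for one report,
    # expressed as a percentage of the mean mark. Marks are invented.
    from statistics import mean, pstdev

    # marks[report] = marks (out of 100) given by each of the ten assessors
    marks = {
        "report_1": [62, 70, 55, 68, 74, 60, 58, 66, 72, 65],
        "report_2": [80, 71, 85, 77, 69, 90, 74, 82, 79, 76],
    }

    for report, scores in marks.items():
        cv = 100 * pstdev(scores) / mean(scores)  # std dev as % of mean mark
        print(f"{report}: mean = {mean(scores):.1f}, "
              f"percentage std dev = {cv:.1f}%")

For a report with a mean mark of 65, for instance, an 18% figure would correspond to a standard deviation of roughly 12 marks across assessors.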

Bibliographic Details
Main Authors: Siddiqui, Salim; Loss, Robert; Hotan, Aidan; Lim, Ming; Zadnik, Marjan
Other Authors: Curtin University
Format: Conference Paper
Published: Curtin University 2010
Online Access: http://hdl.handle.net/20.500.11937/28112