Summary:

Introduction: The assessment of students' skill performance is an often high-stakes task that is frequently reliant on expert examiners' judgements. However, judgements are subject to bias, examiners may 'fail to fail' underperforming students in person, and expert judgements provide little assistance for students developing their own evaluative judgement.

Objectives: With occupational therapy students' interview skills as the focus, this study: (i) interrogated the design of tools for the formative and summative evaluation of students' skill performance, and (ii) created a tool to support actionable formative feedback, robust summative assessment, and shared understanding of qualitative performance characteristics.

Method: In a reflexive action research cycle, we re-designed an interview skills checklist into a qualitative rubric using prior research, empirical data, shared experience, and a recorded examiner consultation and practice session. We implemented the tools formatively for self, peer and/or examiner feedback in simulation programs across two universities, and evaluated a summative rubric in otherwise equivalent viva examinations with successive cohorts at one institution.

Results: A rubric richly describing levels of performance in one cohort (n = 249) vastly improved the measurement of the quality of performance in a subsequent cohort (n = 235) compared with a skills checklist and examiner judgement plus the Objective Borderline Method. The tools demonstrated utility in supporting students' self and peer evaluation.

Conclusion: Taking a novel approach to rubric design, involving markedly shifting the presentation of performance levels, refocussed the task from one of recording a judgement to one of evaluating a performance against commonly agreed criteria.