Using on-line examiner training to improve inter-rater reliability

Malau-Aduli, B.S., Mulcahy, S., Warnecke, E., and Turner, R. (2012) Using on-line examiner training to improve inter-rater reliability. In: Abstracts from 15th Ottawa Conference on the Assessment of Competence in Medicine and the Healthcare Professions. 8B2. p. 133. From: 15th Ottawa Conference on the Assessment of Competence in Medicine and the Healthcare Professions, 9-13 March 2012, Kuala Lumpur, Malaysia.

Background: A crucial determinant of OSCE reliability is the accuracy of examiner judgments, particularly when the pass/fail decision rests with a single examiner. Attempts to improve the reliability of OSCEs include organising training sessions that allow examiners to carry out their role consistently, thereby reducing variability in scoring. However, clinicians' busy schedules and the difficulty of getting away from clinical duties to attend examiner-training sessions mean that the validity and reliability of the examinations can be compromised.

Description: An OSCE collaboration project was developed between two Australian universities in 2010, in which three OSCE stations were developed and embedded in the first clinical phase examinations at both schools. To reduce variability in scoring, an OSCE e-scoring tool was developed and set up in a secure on-line Blackboard Learning System Vista environment. The three shared OSCE scenarios were videotaped and used for on-line examiner training. All internal and external examiners were invited via email one week prior to the examination and given login access and instructions on how to use the program. In their own time, each examiner was able to watch and assess two unlabelled performances (poor and good) of the OSCE case they had been assigned to examine. After completing and submitting their scoring sheets, the examiners were able to compare scores, reflect on their judgments and discuss their decisions on-line. They were also asked to provide feedback on their experiences of the e-scoring program via an on-line survey.

Results: The e-scoring package gave the examiners the opportunity to standardise their marking by comparing their scores with those of their co-examiners and reaching consensus on scoring techniques. Similar trends in the results were observed at both schools, with high inter-rater reliability, especially for the global scores. Examiners valued the process as it allowed them to set the 'expected standard' for the station prior to the actual exam. They also indicated that this sort of tool should be used more widely in OSCEs.
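The abstract does not specify which statistic was used to quantify inter-rater reliability. For categorical judgments such as a single-examiner pass/fail decision, one common choice is Cohen's kappa, which corrects raw agreement for agreement expected by chance. The sketch below, with entirely hypothetical examiner data, is offered only as an illustration of that kind of calculation, not as the study's actual method:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items categorically."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal score frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical global pass/fail scores from two examiners on ten performances.
examiner_1 = ["pass", "pass", "fail", "pass", "fail",
              "pass", "pass", "fail", "pass", "pass"]
examiner_2 = ["pass", "pass", "fail", "pass", "pass",
              "pass", "pass", "fail", "pass", "pass"]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # → 0.74
```

Kappa near 1 indicates close agreement beyond chance; values below about 0.6 are usually read as cause for further examiner calibration.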

Applicability: The close agreement observed between examiner scores in this study, despite the different geographical locations, is attributed to the e-scoring program because it offered a training exercise for both quality assurance and appraisal purposes. The efficacy and ease of use of this novel approach to examiner training indicate the possibility of its wider use in OSCEs.

Adaptability: The importance of the commitment of medical educators to the quality assurance of OSCEs cannot be overemphasised. Results from this study revealed that the e-scoring program has the potential to enhance inter-rater reliability in OSCEs. With increasing student numbers, multiple teaching sites within each medical school, and growing time constraints, this tool will afford time-poor clinicians the opportunity to engage more fully with the assessment process and reach consensus on their scoring techniques, thereby providing validity evidence to all stakeholders.

Item ID: 39525
Item Type: Conference Item (Presentation)
Date Deposited: 28 Jul 2015 01:37
FoR Codes: 11 MEDICAL AND HEALTH SCIENCES > 1199 Other Medical and Health Sciences > 119999 Medical and Health Sciences not elsewhere classified @ 50%
13 EDUCATION > 1301 Education Systems > 130103 Higher Education @ 50%
SEO Codes: 92 HEALTH > 9299 Other Health > 929999 Health not elsewhere classified @ 60%
93 EDUCATION AND TRAINING > 9302 Teaching and Instruction > 930203 Teaching and Instruction Technologies @ 20%
93 EDUCATION AND TRAINING > 9302 Teaching and Instruction > 930202 Teacher and Instructor Development @ 20%
