Jisc case studies wiki Case studies / University of Edinburgh - e-Assessment in Medicine and Veterinary Medicine

University of Edinburgh - e-Assessment in Medicine and Veterinary Medicine

Authors: Michael Begg (michael.begg@ed.ac.uk), David Dewhurst (d.dewhurst@ed.ac.uk)

JISC e-Learning Activity Area: e-assessment

Higher Education Academy Subject Centre: Medicine, Dentistry and Veterinary Medicine

 

Case study tags: online learning, use of specialist software, an effect on exam results, an effect on student personal development, innovation in learning and teaching, an influence on educational research, staff satisfaction with e-learning, a positive effect on retention, an influence on policy, use of resources, modifications to learning spaces, management of learning assets, tangible benefits of e-learning, university of edinburgh, e-assessment, medicine dentistry and veterinary medicine

 

Background & Context

 

The Online System for Clinical Assessment (OSCA) was developed in-house by the College of Medicine and Veterinary Medicine's Learning Technology Section to support the delivery of online OSCE assessments in the undergraduate medical degree programme (MBChB) and undergraduate veterinary degree programme (BVM&S). OSCEs (Objective Structured Clinical Examinations) are an essential component of assessment for medical and veterinary students.

 

Why did you use this e-learning approach?

 

The system was developed in 2004/5 specifically to take over some of the stations of the year 3 clinical OSCE in the MBChB course, which had previously been run face-to-face at manned stations.

 

The drivers for adopting an online approach were:

 

  1. to reduce the possibility for student collusion
  2. to reduce the costs of running OSCEs, which are extremely expensive to deliver
  3. to provide a system-based standard for marking
  4. to reduce the complex administration underpinning assessment processes
  5. to address the growing sense that existing assessment structures were not capturing as accurate a picture of student ability as was possible

 

What was the context in which you used this e-learning approach?

 

There are currently approximately 2,000 students on the two undergraduate programmes in Medicine and Veterinary Medicine at Edinburgh.

 

Many of the 1500+ teaching and clinical staff are involved with assessment to some degree, though the figure is certainly smaller for those directly involved with authoring examinations within OSCA. There are approximately 40 teaching/admin staff registered as authors within the OSCA database though administrative staff often input assessments created by academic staff into the system on their behalf.

 

OSCA was initially used in the specific context of the Year 3 clinical exams of the MBChB programme. However, since 2005 the system has expanded to accommodate other summative and in-course examinations in both the MBChB and the BVM&S. To date, OSCA has been used for 40 formal examinations, ranging from events with around 100 simultaneous users to resit events with a single student. In the 2007-08 academic session OSCA will be the vehicle for 19 discrete assessment events across the two programmes (not including resits).

 

The institution is a licensed user of QuestionMark Perception (QMP), but the working group of academic staff, support staff and learning technologists tasked with evaluating the usefulness of QMP for OSCE examinations concluded that it lacked the flexibility to accommodate all of the assessment requirements of the relevant courses.

This was backed up by a comparative review of OSCA and QMP carried out by a teaching fellow in 2006, which clearly established that the OSCA environment - in terms of both authoring and delivery - was more appropriate, more usable and more popular with staff creating assessments than QMP.

 

It should be noted, however, that one course within the MBChB continues to use QMP for some examinations.

 

What was the design?

 

The project was originally funded by the University of Edinburgh's Principal's eLearning Fund (PeLF), an annual fund which distributes seedcorn grants through a competitive bidding process.

 

Typically an OSCE clinical examination comprises around 20 stations; students move from station to station, each one providing a different clinical assessment and lasting approximately 5 minutes. Examination rooms required significant setting-up, and each station needed to be supervised by one or more academic/clinical staff. A full cohort of students (230+ in Medicine alone) would pass through this examination, making it extremely time-consuming (often extending over two days) and resource-intensive. It was considered that some of the OSCE stations could be replaced by assessments testing the same competencies/skills but delivered in a secure technological environment requiring no staff support, allowing a full cohort of students to be examined in two sittings over a single day.

 

The working group tasked with planning this project comprised course module leaders, clinical teaching staff, clinical skills facilitators, members of the Medical Teaching Organisation, College Office representatives (assessment officer) and representatives from the Learning Technology Section.

 

As the embedding of OSCA within the College's assessment processes has grown, the management of the application has led to the establishment of an e-Assessment group comprising members of the Medical Teaching Organisation, the Veterinary Teaching Organisation, administrative staff, the assessment officer and Learning Technology Section representatives. There is also a CMVM Teaching Fellow who undertakes ongoing review and evaluation of e-Assessment activity and steers much of the required staff development across the College.

 

How did you implement and embed this e-learning approach?

 

Alternatives were considered:

 

  • A PowerPoint-delivered exam in a lecture theatre setting (visibility of slides and the timing of slide rollover counted against uptake)
  • A paper-based exam (poor-quality reproduction of images, poor cost-effectiveness, logistically problematic for marking)
  • An MCQ-format exam compatible with the OSCE to derive a valid overall clinical assessment mark (C. van der Vleuten, 'Validity of final examinations in undergraduate medical training', BMJ 2000;321)

 

Another deciding factor explored by the initial project group was that the written/typed word standardises the examination in a way that a viva/consultation cannot.

 

An online solution - using an appropriately aligned system - offers a written-word-based examination, automated marking, and a reduction in logistical problems and production costs.

 

With regard to practical implementation, a teaching fellow, liaising closely with learning technologists, was able to steer content authors towards writing effective questions, with appropriate resources, that could be accommodated by the OSCA interface (see the ongoing issue about question writing below).

 

Students were made familiar with the interface initially through a mock exam (which allowed for system testing) and subsequently through a video outlining the main interface features.

 

At the time of the original implementation, one of the main concerns was the level of institutional support for online assessment. The level of support required from the central IT support departments had to be negotiated: with the increasing use of both QMP and OSCA for online examinations across the University, those departments were (and are) being asked to support two different secure kiosk environments for assessment events.

 

Provision of physical spaces for conducting online assessment remains an issue for the Institution and work is being undertaken at institutional level to address this issue.

 

Technology Used

 

What technologies and/or e-tools were available to you?

 

The OSCA application is a combined authoring and delivery mechanism that extends an in-house-developed, web-based content creation tool (EROS: Edinburgh Reusable Object Sequencer). EROS comprises a range of question-based templates which enable non-technical academic staff to create e-learning resources easily and quickly. The web-based nature of the application also allows for collaborative development. OSCA essentially adds further question types to the system, designed specifically for OSCE clinical examinations.
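The template-driven approach described above can be illustrated with a small sketch. The real EROS/OSCA data model is not published, so everything here - the class names, fields and example content - is a hypothetical illustration of the general idea: authors fill in structured templates rather than writing markup, and a delivery engine sequences the resulting station objects.

```python
from dataclasses import dataclass, field

# Purely illustrative sketch: the actual EROS/OSCA schema is internal to
# the University of Edinburgh and may differ entirely. The point is that a
# non-technical author only supplies structured content; presentation and
# sequencing are handled by the delivery engine.

@dataclass
class Question:
    prompt: str
    question_type: str                                 # e.g. "short_answer"
    resources: list = field(default_factory=list)      # attached images/audio
    accepted_answers: list = field(default_factory=list)

@dataclass
class Station:
    title: str
    time_limit_minutes: int
    questions: list = field(default_factory=list)

# An author "fills in the template" for one virtual OSCE station:
station = Station(
    title="Cardiac auscultation",
    time_limit_minutes=5,
    questions=[
        Question(
            prompt="Name the murmur heard in the recording.",
            question_type="short_answer",
            resources=["heart_sounds.mp3"],
            accepted_answers=["aortic stenosis"],
        )
    ],
)
```

Because each station is a plain data object, the same content can be reused across mock exams, live sittings and resits, which mirrors the "reusable object" idea in the EROS name.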

 

As the application was built proximally - that is to say, in close collaboration with teachers, administrators and assessment officers - the question types and section (or station) design reflect as accurately as possible the question and response expectations of the MVM programmes.

 

Tangible Benefits

 

What tangible benefits did this e-learning approach produce?

 

  • Evaluation of OSCA as an effective mechanism for delivering online assessment in the College has taken a number of forms. At assessment events themselves, external examiners ask student groups how they found the exam. The examiners have also taken the time to effectively "sit" the exam themselves. The informally gathered, anecdotal feedback from these sessions has been wholly positive.
  • Formal questionnaires were completed by students via the integrated evaluation engine within the medical VLE, EEMeC and these contained questions relating to the OSCE assessments. The feedback from students suggests that the question-delivery interface is clear, the navigation through question sections is clear, and the range of question types and their presentation is also satisfactory. They, on the whole, perceive it to be an appropriate and "professional" way of conducting exams.
  • The use of virtual OSCE stations (as developed and delivered via OSCA) has reduced the number of physical OSCE stations and consequently the manpower needed to support them. Auto-marking of the OSCE examinations has also resulted in considerable savings in staff time. The facility to support grouped marking (the alignment of same and similar responses for grouped checking and marking) has also contributed to the decision to continue and evolve the use of OSCA.
  • The chances of student collusion outside the examination venue are significantly reduced as even large-scale events require only two back-to-back sittings.
  • With each event comes new questions which contribute to a growing bank of questions that can be drawn upon for future events. This should lead to more efficient creation of future questions, as authors can model new questions on tried and tested question types.
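The grouped-marking facility mentioned above can be sketched in a few lines. OSCA's actual implementation is not published; this is an assumed, minimal version of the underlying idea - normalise free-text responses and bucket identical answers together, so an examiner rules once per distinct answer rather than once per student.

```python
from collections import defaultdict

# Hypothetical sketch of grouped marking; the real OSCA logic may use
# more sophisticated similarity matching than exact normalised equality.

def normalise(response: str) -> str:
    """Collapse case, surrounding whitespace and repeated spaces."""
    return " ".join(response.lower().split())

def group_responses(responses: dict) -> dict:
    """Map each distinct normalised answer to the students who gave it."""
    groups = defaultdict(list)
    for student_id, text in responses.items():
        groups[normalise(text)].append(student_id)
    return dict(groups)

responses = {
    "s001": "Aortic stenosis",
    "s002": "aortic  stenosis",
    "s003": "Mitral regurgitation",
}
groups = group_responses(responses)
# The first two responses fall into one group, so the examiner makes
# two marking decisions instead of three.
```

With a full cohort, the savings compound: a few hundred free-text responses to a station typically collapse into a much smaller set of distinct answers, each of which is checked and marked once.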

 

Did implementation of this e-learning approach have any disadvantages or drawbacks?

 

  • The system requires a significant amount of staff development work by learning technologists and teaching fellows: acquiring suitable resources for digital delivery, and composing questions in ways that "make sense" at a system level.
  • The reliance upon limited physical space within the institution suitable for delivery of online assessment may soon lead to constraints upon how much can actually be delivered via the system.

 

How did this e-learning approach accord with or differ from any relevant departmental and/or institutional strategies?

 

Although OSCA began as a College-level resource - which caused a degree of discord with the institutional-level support for QMP - there is now acceptance not only of the College's need for OSCA, but also that the application may be of use outwith the College. The possible friction the dual systems may have given rise to has been partly abated by the efforts of the teaching fellow who, in the course of evaluating the systems in use within the College, maintained good dialogue with central IT support bodies as well as local College staff. OSCA has certainly influenced assessment policy in the College and, if adopted more widely, may influence University assessment policy.

 

Lessons Learned

 

Summary and Reflection

 

OSCA has proven popular and effective as an engine for composing and delivering online assessments and is now used for online assessments beyond its original focus of OSCE clinical examinations. There have been considerable savings in the resources used to deliver examinations, mark assessments and create assessment questions. Staff and student satisfaction with the system is high, and its development has been a driver for staff development in the skills required to design effective, high-quality assessments. There has been some impact on College assessment policy. There is interest in implementing OSCA more widely across the University, as it supports question types not supported by QMP. One of the main constraints is having the physical facilities to deliver a growing number of online examinations.

 

Further Evidence

 

'The feedback from students suggests that the question-delivery interface is clear, the navigation through question sections is clear, and the range of question types and their presentation is also satisfactory. They, on the whole, perceive it to be an appropriate and 'professional' way of conducting exams.'

 

'virtual OSCE stations... has reduced the number of physical OSCE stations and consequently the manpower needed to support them. Auto-marking of the OSCE examinations has also resulted in considerable savings in staff time.'

 

'the chances of student collusion... are significantly reduced'

 

'new questions which contribute to a growing bank of questions that can be drawn upon...'