Jisc case studies wiki Case studies / The University of Warwick - Use of e-assessment in medicine
The University of Warwick - Use of e-assessment in medicine

Author: David Davies (david.davies@warwick.ac.uk)

JISC e-Learning Activity Area: e-assessment

Higher Education Academy Subject Centre: medicine, dentistry and veterinary medicine

 

Case study tags: online learning, an effect on learning, an effect on student personal development, innovation in learning and teaching, an influence on policy, use of resources, tangible benefits of e-learning, university of warwick, e-assessment, medicine dentistry and veterinary medicine

 

Background & Context

 

Why did you use this e-learning approach?

 

In Warwick Medical School (WMS) we have adopted an assessment blueprinting approach to help match learning outcomes and competencies against examination question items. In Phase I of the WMS undergraduate medical course students are assessed at the end of each semester by written answer exams and objective structured clinical examination (OSCE). In the written exams integrated questions are coded against General Medical Council (GMC) competency themes. Students receive feedback on their written exam performance that guides them to plan learning strategies to help attain GMC competencies.
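The blueprinting idea described above can be illustrated with a minimal sketch. The sub-item identifiers and theme names below are invented for the example; the actual WMS coding scheme is not given in this case study.

```python
# Hypothetical blueprint: each integrated exam question sub-item is
# coded against one or more GMC competency themes.
blueprint = {
    "Q1a": ["Good clinical care"],
    "Q1b": ["Good clinical care", "Scientific basis of practice"],
    "Q2a": ["Communication"],
}

# Inverting the blueprint shows curriculum coverage: which sub-items
# assess each competency theme.
coverage = {}
for item, themes in blueprint.items():
    for theme in themes:
        coverage.setdefault(theme, []).append(item)
```

A coverage map like this also makes gaps visible at exam-planning time: a theme with no associated sub-items is not being assessed by that paper.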

 

What was the context in which you used this e-learning approach?

 

WMS offers the largest graduate-only medical degree in the UK. The four-year MBChB (Bachelor of Medicine and Surgery) programme is designed for graduates in biological, natural and physical sciences. We produce graduates with the clinical competence to work as Foundation Year 1 and 2 doctors, with the understanding, intellectual skills and attitudes to practise medicine, along with the habit of lifelong learning necessary for a successful, continually developing career. UK medical schools are accredited on their ability to meet GMC standards for medical graduate knowledge, skills and attitudes. Assessment is a key part of learning, and an integral element of assessment is feedback to the learner. The GMC have identified in Tomorrow's Doctors that 'Feedback about performance in assessments helps to identify strengths and weaknesses, both in students and in the curriculum, that allow changes to be made'. The GMC have also identified a number of curriculum competency areas that define the knowledge, skills and attitudes of a graduate medical student.

 

In WMS we wanted to develop an assessment feedback system that gave students insight into their exam performance from the perspective of attainment in modules and in relation to GMC competency areas. We believe that giving students personalised feedback that addresses the GMC competency themes will help individuals to become better learners. Prior to taking this approach, written assessments did not use integrated questions. Instead, questions were based on module teaching and feedback was simply a score in each module. As modules are essentially arbitrary divisions of the course, this approach did not encourage integrated thinking, nor did it seek to address the broader GMC competencies expected of today's graduate students.

 

What was the design?

 

The blueprinting approach was developed by a small number of key staff led by the Associate Dean for Teaching. Consultation during the blueprinting process involved all module coordinators. The written assessment sub-group of WMS Phase I teachers developed each integrated exam paper. The Associate Dean for Teaching and Course Director then coded these items against GMC competency areas as well as identifying which module was the logical 'home' for each question sub-item. The optically marked sheets for exam markers were developed in-house in consultation with the University's eLab IT team. The database for producing individualised feedback was created in-house. The WMS admin officer for Phase I of the course was a key player in developing the procedure for getting exam papers marked, scanned and processed. The Examinations Officer for WMS scrutinised the whole process.

 

How did you implement and embed this e-learning approach?

 

Moving towards a competency-based assessment was not natural for all teaching staff in WMS. There were 'hearts and minds' that had to be won to establish the need for this approach. For a start, not everyone, including some module coordinators, was aware of the GMC competency areas. Personal tutors were not initially equipped with the skills to support their students once they had received their exam feedback. We did not have the necessary IT in place before starting this experiment, and although we have ironed out many bugs we are still actively developing our approach.

 

We have so far piloted this blueprinting approach on two end-of-semester exams: semester 1 (December 2006) and semester 2 (June 2007). Because of time constraints we had not arranged a training session for personal tutors the first time we ran this new feedback system at the end of semester 1. Consequently we had students unsure how to interpret their results and personal tutors who did not feel equipped to offer support. We learnt rapidly from this and made sure we arranged training sessions before feedback was given for the end-of-second-semester results.

 

We plan to conduct a full evaluation of this new feedback system at the end of semester 3 when the system will have run for the full Phase I of the course. Preliminary results indicate that students welcome the new system as a way of thinking beyond modules.

Because of the requirement for timely processing of exam marks we are contemplating purchasing our own optical mark reader system. To date we have used the central IT Services system.

 

We recognise that to be truly successful in offering a competency framework that allows students to look across all four years of the course, we need to blueprint all assessments, not just written exams and not just in Phase I. We also want to develop an assessment item bank, a better automated feedback system (instead of email) and a competency element to our portfolio system. To that end we have bid for funds from the Central University Education Innovation Fund to create a new IT post within WMS to help develop these systems. We hope to appoint to the new post in the third quarter of 2007.

 

Technology Used

 

What technologies and/or e-tools were available to you?

 

By adopting our chosen blueprinting approach we knew we would face a number of technical challenges. Firstly, our assessment sub-group coded each question and its sub-items against GMC competencies. This matrix of question items and competencies could not be managed on paper, so we developed a simple spreadsheet. Next we needed a method for recording each student's performance on integrated questions. To achieve this we devised record sheets for exam markers that could be scanned by an optical mark reader, resulting in a spreadsheet of individual student scores for each question sub-item. We developed a database application to correlate exam marks against competency areas, calculating each student's attainment in each theme. We also calculated class average attainment in each area. Students were emailed their overall scores on integrated questions along with their attainment in each competency area matched against the class average. We copied feedback emails to personal tutors so that they could help students interpret results.
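The calculation at the heart of this step, rolling sub-item marks up into per-theme attainment and a class average, can be sketched as follows. This is a hypothetical illustration only: the sub-item identifiers, theme names, marks and data layout are invented, and the actual WMS database application is not described in detail in this case study.

```python
from collections import defaultdict

# Assumed blueprint: sub-item id -> (maximum mark, competency themes assessed)
blueprint = {
    "Q1a": (5, ["Clinical care"]),
    "Q1b": (10, ["Clinical care", "Science"]),
    "Q2a": (8, ["Communication"]),
}

# Assumed OMR output: student id -> {sub-item id: mark awarded}
scores = {
    "s001": {"Q1a": 4, "Q1b": 7, "Q2a": 6},
    "s002": {"Q1a": 3, "Q1b": 9, "Q2a": 8},
}

def attainment(student_marks):
    """Percentage attainment per competency theme for one student."""
    gained = defaultdict(float)
    available = defaultdict(float)
    for item, mark in student_marks.items():
        max_mark, themes = blueprint[item]
        for theme in themes:
            gained[theme] += mark
            available[theme] += max_mark
    return {t: 100 * gained[t] / available[t] for t in available}

per_student = {sid: attainment(marks) for sid, marks in scores.items()}

# Class average per theme, reported alongside each student's own figure.
themes = {t for _, ts in blueprint.values() for t in ts}
class_avg = {t: sum(a[t] for a in per_student.values()) / len(per_student)
             for t in themes}
```

Because a sub-item can be coded against several themes, one mark may contribute to more than one attainment figure, which is what lets a single integrated question speak to multiple GMC competencies.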

 

Tangible Benefits

 

What tangible benefits did this e-learning approach produce?

 

We have yet to fully evaluate our new system but our aims are as follows, in no particular order:

 

  • To raise awareness amongst teaching staff and module coordinators about the importance of learning outcomes in the context of GMC competency statements.
  • To help each student to personally identify their own learning needs to attain the GMC competencies.
  • To help each student appreciate how the course is an integrated whole that helps them attain the knowledge, skills and attitudes necessary to become a graduate doctor.
  • To develop the in-house technical skills to confidently handle complex data and to create appropriate systems to interpret and use data to help teaching and learning.
  • To improve the efficiency of exam marks processing through the use of OMR marking sheets.
  • To help students develop an enthusiasm for learning and to see it as a life-long activity.
  • To help prepare each student for work in the National Health Service.
  • To help distinguish the Warwick medical course as being innovative and responsive to the needs of 21st century healthcare.

 

We have already been able to measure the impact of our approach as it has influenced assessment policy. Our assessment procedures manual is being updated as a result of this new approach.

 

Did implementation of this e-learning approach have any disadvantages or drawbacks?

 

We have so far not encountered any drawbacks to our new system other than the technical development required to make it work in the first place. The reliance on optically marked exam markers' sheets is still a rate-limiting step, but we hope to improve this with each iteration of the system.

 

There is clearly a cost (rather than a drawback) in that we have identified the need for a new IT post to help create the systems required to roll this out to the whole course and to train admin staff in WMS.

 

We can see no pedagogical drawbacks to providing students with better, individualised feedback on exam performance. Greater awareness of knowledge and skills will have a positive impact on student personal development.

 

How did this e-learning approach accord with or differ from any relevant departmental and/or institutional strategies?

 

This assessment feedback approach was developed alongside the strategic decision to move towards integrated exam questions. By using integrated questions we recognised the need to provide students with better feedback as it would not be easy for every student to relate performance on integrated exams to module teaching. So in that sense the development of this new feedback system was entirely complementary to the strategic goals of the institution.

 

When we were successful in winning development funds from the University centre, one of the selling points was that our system could in theory at least be adopted by any other department/course on campus, and that was perceived as an advantage.

 

Our new feedback system is now embedded, in that integrated questions coded against GMC competency areas are a routine part of Phase I written exam planning.

 

Lessons Learned

 

We are pleased with the results of our assessment feedback system to date as it is in complete accord with the move towards integrated exams. We believe that providing students with more than just their module scores, specifically providing them with competency feedback is essential to developing a better understanding of each learner's needs and enhances student personal development. The change in assessment procedures has influenced departmental policy.

 

We see a number of exciting possibilities for our new system including rollout to the rest of the course and in different exam modalities such as OSCE. Incorporating the competency approach with assessment feedback into portfolio does seem a natural development.

 

Finally, a full-scale evaluation of the new system at the end of Phase I will allow us to involve students in future planning of the system as well as gather valuable data about how the feedback is being interpreted and used in practice.

 

Further Evidence

 

'In WMS we wanted to develop an assessment feedback system that gave students insight into their exam performance'.

 

'The change in assessment procedures has influenced departmental policy'.