Leeds Metropolitan University - Use of summative computer assisted assessment in Applied Technology

Author: Steve Jones (s.r.jones@leedsmet.ac.uk)

JISC e-Learning Activity Area: e-assessment

Higher Education Academy Subject Centre: Hospitality, Leisure, Sport and Tourism

 

Case study tags: an effect on exam results, an effect on student personal development, innovation in learning and teaching, staff satisfaction with e-learning, use of resources, tangible benefits of e-learning, leeds metropolitan university, e-assessment, hospitality leisure sport and tourism, online learning

 

Background & Context

 

This example looks at the use of CAA on a module with large numbers of students. The assessment comprises five stage tests worth 10% each and an end-of-module exam worth 50%. The tests have both formative and summative aspects. The drivers for developing this approach were to give students quick feedback from early in the module, to introduce them to HE assessment processes, and to save staff the considerable time involved in marking 350 scripts.

 

Why did you use this e-learning approach?

 

This approach was adopted on courses in Events Management and Tourism Management for a number of reasons. These included the University assessment, learning and teaching strategy that stressed the use of computer assisted assessment (CAA). One of the specific targets of the strategy is to 'expand the use of computer aided assessment for both formative and summative assessment, with at least one appropriate and well designed CAA assignment in each undergraduate course by 2007'.

 

Additionally, we wanted to give early and rapid feedback to level 1 students in semester one as to their progress in a higher education environment.

 

Finally, we wished to introduce students to the rigid deadlines demanded by the University, in contrast to the expectations of many students arriving from further education, where assessment deadlines could often be negotiated.

 

What was the context in which you used this e-learning approach?

 

This approach to e-learning is used on the module 'Applied Technology and Finance', taken in semester one of level 1 by students on both the tourism and events management courses. A large number of students take this module (around 350 were enrolled in 2006-07), with a proportionate number of staff looking after workshop/seminar groups.

 

Finance and technology subjects can often cause stress and difficulty for students who may have struggled at school with numeracy and IT literacy.

 

Prior to the adoption of this approach, assignment submission was on paper, and marking and returning feedback to students took a considerable amount of time. In addition, it was not possible to give the students multiple opportunities to receive feedback; they completed one end-of-module assessment, and the mark and feedback were returned three to four weeks later.

 

This approach to e-learning has presented few problems or challenges in terms of implementation, since the drivers for change came from staff teaching on the module. These staff subscribed to the principles of rapid and frequent feedback described in the introduction to this approach. There was some concern around the final assessment involving all 350 students and whether the virtual learning environment (VLE) would cope with this level of traffic.

 

What was the design?

 

The students were introduced to this approach at the very beginning of the module, with full details of what was expected of them set out in a module handbook given out in paper form and available on the VLE. The students were inducted into an approach that asked them to complete prescribed reading and activities each week and supplied them with supplementary activities and reading that they were encouraged to undertake. Every two weeks, starting at week three, the students had to take a test (summative assessment), with the questions based on the activities of the previous two weeks along with a small number of questions based on the supplementary reading and activities. The students received their mark immediately on completion of the test, and feedback was made available at the end of the week in which the tests took place. At the end of the module a formally scheduled exam, which all students took on the same morning, was held using the VLE.

 

During the initial development of this approach the design process was led by WebCT champions who also happened to be members of the module team. Subsequently all tutors on the module have taken part in the incremental development of this activity and the dissemination of this approach.

 

How did you implement and embed this e-learning approach?

 

The roll out of this approach was relatively straightforward, with few issues. No staff other than the module team were involved, and the students were thoroughly briefed before the assessments and before the end exam. The VLE support team did, however, monitor the impact on resources of the final exam; this proved to be remarkably low.

 

Initial evaluation involved regular meetings of the module development team to focus on implementation issues and to discuss the outcomes of any discussions with students about this approach. In the initial pilot, students' opinions were sought on a regular basis.

 

In initial trials of this approach a number of students who missed tests asked tutors if they could take the test the following week. Since the tests were akin to examinations this was not normally possible, but the decision was taken to allow students to take the tests during the week in which their normal class took place. In the first week the students could retake the test just by asking, but in subsequent test sessions only plausible excuses were accepted, culminating in the need for the student to submit mitigation for their absence from the final test. The main reason for this was that a number of students were not used to strict University processes, and the team wanted the chance to reinforce this over a number of weeks.

 

Although the tests and the final exam were held in exam-like conditions, a small number of students complained that they were aware of students looking at the computer screens of other students taking the test. This was possible because the initial tests and exams were presented to all students in the same way. After this complaint, the question banks were extended and questions were selected at random by the VLE - typically this meant that students would get 20 questions presented randomly from a bank of 30 questions. At the same time the decision was taken to randomise the order in which the potential answers appeared for each multiple choice question. Interestingly, a large number of students commented on this approach, and the initial complainants noted that it was now virtually impossible to see the answers on the computers of those around them.
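
A minimal sketch may help to illustrate the randomisation described above. This is not the WebCT Vista implementation - the randomisation there was configured through the VLE's own assessment tools - rather, it is a hypothetical Python illustration of drawing 20 questions from a bank of 30 and shuffling the answer options for each multiple-choice question, so that neighbouring students see different papers.

```python
import random

# Hypothetical illustration only - not the WebCT Vista implementation.
# Each student receives a different paper: 20 questions drawn at random
# from a bank of 30, with the answer options shuffled for every question.

QUESTIONS_PER_TEST = 20
BANK_SIZE = 30

def build_paper(question_bank, seed=None):
    """Return a randomised paper for one student."""
    rng = random.Random(seed)
    selected = rng.sample(question_bank, QUESTIONS_PER_TEST)  # pick 20 of the 30
    paper = []
    for question in selected:
        options = list(question["options"])
        rng.shuffle(options)  # randomise the order of the possible answers
        paper.append({"text": question["text"], "options": options})
    return paper

# Example usage with a toy bank of multiple-choice questions:
# bank = [{"text": f"Question {i}", "options": ["A", "B", "C", "D"]}
#         for i in range(BANK_SIZE)]
# paper = build_paper(bank)
```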

 

Technology Used

 

What technologies and/or e-tools were available to you?

 

The University uses WebCT Vista as its VLE. Whilst there is no compulsion to use this VLE, no other technologies were considered; the module team used this approach quite simply because all the key technologies needed were in place and ready to use. The approach used the assessment tools within WebCT Vista, and some members of the module team also used Respondus to facilitate the writing of questions. Familiarity with these tools was also an important factor, since both staff and students were either already familiar with them or would be expected to gain familiarity with them for use in other modules.

 

Tangible Benefits

 

What tangible benefits did this e-learning approach produce?

 

This approach produced a good number of tangible benefits for all involved. For the institution this was the first large-scale test of the VLE when used for computer assisted assessment. This has proved important for the achievement of the strategy target outlined at the top of this case study.

 

From the point of view of the module tutors, the most tangible benefits were seen at the end of the module. Marks for all 350 students were submitted within three hours of the completion of the exam. This could potentially have been reduced further, but the module team took the decision to review the statistics for each question in each test and exam to ensure that they were appropriate in terms of wording and discrimination. In previous years, marking and standardisation of paper-based scripts from 350 students took a considerable amount of time. At 20 minutes per paper-based script, marking alone would have taken around 117 hours; add moderation/standardisation time and the process could easily take 120 hours. With this approach the time spent is no more than 20 hours, including exam supervision (by the course team) and checking results at the end. These savings in cost and time are now apparent after three years of developing this module in this form, but increased use of reusable learning objects and question banks could reduce initial development time. It is important to recognise the need for this upfront investment of time and for institutions to take account of this when looking at this type of CAA.
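
As a rough check of the marking-time comparison above, using the figures quoted in this case study (350 scripts at roughly 20 minutes each, versus about 20 hours in total under CAA):

\[
350 \text{ scripts} \times 20 \text{ minutes per script} = 7000 \text{ minutes} \approx 117 \text{ hours}
\]

which suggests a saving of around 100 staff hours per run of the module.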

 

Delivering tests every two weeks gave rise to a marked improvement in attendance during the weeks in which tests took place and, to a lesser extent, between the tests. The results were sent to personal tutors immediately after each set of tests, which allowed them to see how their tutees were progressing, to follow up on any absences and to head off early attrition.

 

Perhaps the biggest improvement was in the mean mark for students on the module. Prior to the implementation of this approach a typical mark was around 53%; this rose to 63% in 2006. The biggest area of benefit was in the final exam, where the decision had been taken to put all previous years' exams online along with full feedback and to allow students to take these practice exams as many times as they wished. Those taking advantage of this had an average exam mark 15% higher than those who did not.

 

One important qualitative benefit of this approach has been that students are less apprehensive about taking a Finance subject, since they get regular feedback on how they are doing.

 

Did implementation of this e-learning approach have any disadvantages or drawbacks?

 

It should be noted that the testing and preparation of large numbers of questions requires a very considerable upfront outlay of time and effort over and above the normal deployment effort for the module. This came out of the personal time of staff teaching on the module, but three years after the initial pilot it is now paying off. As this module is made available to a wider number of students, the fact that little additional effort will be required will be an important benefit to the tutors teaching on it.

 

One interesting issue with this approach is that any new staff teaching on the module require an induction into the approach and some development in producing questions and in understanding the overall process of ensuring quality. However, it is important that this is not seen as a one-off, because the skills can be applied in other modules.

 

It should also be noted that the module team consider that this approach is best suited to level 1 students and that there would be considerable difficulties in testing level 3 students. Although this may be possible, efforts to extend computer assisted assessment are currently focused at level 1.

 

How did this e-learning approach accord with or differ from any relevant departmental and/or institutional strategies?

 

This approach is in line with departmental and institutional strategies. However, it does differ from discussions of how these strategies might be implemented, in that the initial expectation was that computer assisted assessments would be formative rather than summative in nature. The development team, however, considered that, given the strategic approach to learning often seen in level 1 students, significant numbers would not engage with frequent formative assessment since it carried no apparent credit. The high levels of engagement on this module at least indicate that small amounts of summative assessment spread over the module are something students will engage with very successfully in their first year.

 

Implementation of approaches such as this at departmental or institutional level is neither straightforward nor appropriate for all modules. This approach works best in modules with large numbers of students where the initial outlay of time and effort will have a bigger pay off at a later date. It also assumes either a reasonable level of skill among module tutors or the presence of learning technologists within departments or institutions. As staff within the faculty acquire the skills and knowledge of these approaches and adapt their modules accordingly, embedding them in teaching becomes possible - it is likely that more of this approach will be adopted during 2007-08.

 

Lessons Learned

 

Summary and Reflection

 

This approach has been, on the whole, effective in pedagogical terms. It has delivered a number of important benefits for students, but equally the approach has some drawbacks such as one type of feedback not necessarily fitting all students.

 

In terms of the institutional assessment, learning and teaching strategy this approach is important and certainly fits the target noted in the initial paragraphs of this case study. However, as an example it is a little problematic, since some staff see the size and coverage of the assessment within this module as difficult to emulate. Within the faculty (and to some extent within the institution) there is a need for further small-scale use of formative computer assisted assessment; certainly this would be an easier goal to achieve.

 

Much of the learning from this approach is apparent within the case study, since evaluation and review have been ongoing during the running of the module and in a full module review after each year's run. Where appropriate, the module team reacted to issues and implemented solutions as the need arose.

 

Future developments are likely to include broadening the range of question types, but are unlikely to include further development of computer assisted assessment within this module. However, staff teaching on this module are considering using this approach in other modules.

 

Further Evidence

 

'Marks for all 350 students were submitted within three hours of the completion of the exam... In previous years... the process could easily take 120 hours.'

 

'This approach works best in modules with large numbers of students where the initial outlay of time and effort will have a bigger pay off at a later date.'