Considered one of the top institutions in the UK to study veterinary medicine[i], the historic Royal Dick School of Veterinary Studies has built a reputation for providing the highest quality learning and teaching to its undergraduates.
The School runs two undergraduate programmes, lasting either four or five years. On average, each student sits at least two summative exams per course annually – a combination of multiple choice question (MCQ), short written answer and practical (OSCE) assessments, with the majority being multiple choice exams.
Over time, a large number of ‘do-it-yourself’ question banks supporting the MCQ exams had sprung up across the programmes. These banks were proving nearly impossible to manage across the entire curriculum and, importantly, they did not generate item statistics easily. Not knowing which questions were performing well and which needed work meant that the School did not have the information necessary to improve its assessments.
It was apparent that it was time to find software that would assist the School to achieve their objectives, rather than hold them back.
As the manager of the Veterinary Training Organisation at the School, Lindsay Dalziel focusses on improving the assessments across the undergraduate programmes. Knowing how powerful it would be to have visibility of question and exam performance, Lindsay began the project to overhaul the School’s approach to question banking a couple of years ago. The visibility she sought would give the School the opportunity to improve questions and weed out those that were underperforming – and, in the process, to lay the foundations for improving assessment outcomes from both a student’s and an academic’s perspective.
Initially, the School tried two different sets of question banking software. However, neither was taken up widely, as the user experience didn’t meet expectations. Lindsay persisted with the project and continued to look for an alternative that would meet everyone’s needs.
A trial of Speedwell’s eSystem software quickly demonstrated that it would deliver what they needed – an easy-to-use question bank with robust, inbuilt item analysis. Additionally, administrators and academics would be able to edit questions from anywhere, liberating all involved from the version-control issues that had plagued Word and Excel question banks.
The added bonus for the School was the tight integration between eSystem software and their existing installation of MultiQuest software. MultiQuest allows users to prepare and manage paper multiple choice exams – and using both sets of software together meant that the School could deliver both online and paper assessments and still generate all the vital question statistics, then review and store them in the same eSystem bank.
Lindsay is overwhelmingly positive about how easy it was to move existing questions into their new software. “Importing the questions wasn’t a difficult task,” says Lindsay. “In fact, with some minor tweaking to formats, we imported all our questions from our Excel banks easily. It was a slightly bigger job importing our questions from Word – but with the help of a temp who reformatted the content, we were able to import all of them too.”
Lindsay’s team took a staged approach to implementation. At the beginning, the software was used to generate exams for only a couple of courses, and the associated exam papers were printed off as PDFs for the students to use alongside standard marking sheets. The sheets were then scanned using an OMR scanner and the results were uploaded from MultiQuest into the eSystem as raw scores.
More recently the School has begun to deliver more and more of its exams online too, which is making life easier for the team – especially when there are students undertaking rotations at remote vet practices.
As a consequence, the School has built up a series of performance statistics, which the academic staff are using to improve their assessments.
The combination of running paper and online exams and having visibility of the vital question statistics in one tool is bringing the anticipated benefits to the School. Lindsay says, “What the eSystem has given us is more time – we can now see at a glance where improvements need to be made, whereas before we either didn’t have the data or had to work it out manually – and that could take some time cobbling it all together. eSystem software turns our question data into actionable insights.”
The School has ambitious plans for the use of the software – inspiring all involved in assessment in the undergraduate programmes to use the eSystem’s exam blueprinting function extensively. With a bank of statistics being built up on each question’s outing, the team hope to demonstrate how using the blueprinting function will further align learning outcomes with the assessments – driving stronger outcomes for all involved.