UniSA

Friday, April 3, 2009

A conversation with Peter Kentish about Moodle quizzes

I cornered Peter Kentish recently, gave him coffee and a comfy chair with a cushion, and asked him to recount some of his experiences of using Moodle quizzes to share with our readers.

How did you come to be using Moodle quizzes?
It all started in 2007 when the new common first year in engineering began. I had always taught my old Engineering Materials course, with a cohort of 60-70 students, using 6-7 multiple-choice and text-based questions, but with the new common first year core course the cohort was going to be 200! There was no way I was going to be able to do all that marking! I knew that the quiz approach had worked well for motivating my students to learn consistently in this course, so I asked for an automated system to help me mark these tests.

It wasn't the first time I had asked for an automated assessment system. The UniSAnet team had tried several times, but were quite frustrated in their efforts to create an in-house online assessment tool and resorted to trialling the Moodle version. I was one of two people who were allowed to use this environment and test it for the University.

How do you use it?
I run 6 tests during the study period for this course. For each of these there is actually a pair of quizzes. The first is a practice (non-assessable) one that students can attempt as many times as they like, finding out which questions they got right or wrong and what the right answers are. Then there is an assessable version that counts towards grades. There are about 30-35 items in the assessable one and 15 or so in the practice version.

A screen grab from the Moodle quiz environment
 
Students do the assessable quizzes in a computer pool during their tutorial time. I set the time the quiz is open for and a password, and I make these different for each tutorial class. When all the classes have finished taking the test I send the students their results, which have been marked automatically.

What has it been like using Moodle quizzes?
The system itself has been quite good and robust. There have been virtually no bugs in the system; just one remains unsolved, and it has no negative effects provided you are aware of it.

Like most things, there are positives and negatives. A lot of time went into setting up the database of questions, but the time saved by the automatic marking has more than made up for it, particularly now that I am in my second year of using it. The benefit will continue in future years as less time is required to maintain the database.


How do you come up with your questions?
When I first started I had my old questions, which took a long time to enter into my database. Now, when I want to set up a test I create a shell for it (setting the time and password for the class) and select the questions I want from my database. Each year I choose different questions. I also use the random shuffling functionality: my selected questions are presented in a different order, and the answer options are also in a different order, for each student (this shuffling feature can be turned off).

I also get ideas for questions at tutorials, where the content is applied to problems, and write new questions to add to my database. This is good as I really want to test students' application of knowledge rather than recall of memorised facts.

Tutorials are not assessed directly, so to motivate students to engage with the tutorial activities I warn them that a certain number of the quiz items will come from the tutorials, so they need to do this work. In other words, memorisation of facts will not be enough to be successful: you are not much good as an engineer if you cannot apply knowledge to solve problems!
An example of a question from a practice quiz

What do the students think?
They generally like multiple-choice questions best; they think they have an advantage because the right answer is there and they just need to find it. For the first practice quiz there were 958 attempts: some students attempted it 15 times, others not at all. The average mark for the test was about 70%.
Even though the students were advised that quiz 2 would be more complex than the first, the number of attempts on the practice test decreased to 795, probably because the students had more competing interests, or had been lulled into a false sense of security by test 1. The average mark for this second, more difficult test was lower, at around 60%.

1 comment:

  1. If you have any questions for Peter about Moodle quizzes (which is what we will be using in the future) please ask by making a comment using the field below. If you are not logged in with a Google Account please add in your name as well. Many thanks
    Diana
