UniSA


Tuesday, June 16, 2009

Postgrad Engineering - online course design and experience

Today Yousef Amer will be talking to us about his course redesign experience for a postgraduate course, Supply chain management.

When I started designing this online course my aim was to have my students (class size of ~70) engaged, to the point that they were interacting and sharing knowledge. To do this I used Gilly Salmon's five-stage model of teaching and learning online, as described in her books E-Moderating and E-tivities.

The web site was simplified as much as possible and separated into two sections: Teaching and Learning. The Teaching section contained e-readers, journals, books, solutions and PowerPoint presentations. The Learning section contained the interactive elements - the discussion groups and quizzes.

Addressing Stage 1, "Access and motivation", the front page was welcoming and outlined all the main folders and features of the course site. Students were reassured and encouraged to take time to become familiar with the site and the information available on it. There was one face-to-face lecture that demonstrated the online environment to the students. I walked them through the different components of the web site, highlighting the critical components - the three discussion boards. One was a lecture notice board, one was a group discussion board where students could only access their own group (~10 students), and the final one was a general discussion board. Students were shown where regular communication occurred. I also provided a consultation time for those who needed one-on-one support, mostly technical support.

Stage 2 is "Online socialization". The first activity asked students to introduce themselves online and give some background on their work experience, qualifications and aspirations. The communication rules were also spelt out.

In Stage 3, "Information exchange", the students were required to respond to another student's post in their discussion group. This gave them the opportunity to access work from at least one other student, or the entire group if they wanted to, and the chance to peer review. I used weekly online quizzes to allow the students to self-assess their knowledge of the topic and become aware of the gaps in their knowledge and opportunities for development. To build up and establish the groups online, e-tivities were assigned two weeks apart.

Stage 4 relates to “knowledge construction” – and this is where the students were working in groups. Most groups were communicating quite well and some were already tackling the group assignment in an organised fashion.

This sense of "development" relates to Stage 5. Students spoke more freely and expressed themselves more openly online compared to the lecture and tutorial setting. Communication was also more immediate, with some students checking the discussion board daily.

As an online teacher it was an interesting experience. You could see the groups forming through the discussion board, becoming active and supportive of each other - finishing by exchanging telephone numbers and meeting times. For me it was important to keep track of their contributions so that the 'quieter' students didn't get left behind. This was a bit tricky at times, especially as some students have similar names. Those students who weren't making postings were encouraged by other students, and if they disappeared for too long I emailed them. Concerns about their expression of English were common - but after I reassured them that they were not being assessed on spelling and grammar, they became more active in the discussion.

The student evaluations of these learning experiences have been very positive, particularly about the way that this learning can be applied in real situations.

Friday, April 3, 2009

A conversation with Peter Kentish about Moodle quizzes

I cornered Peter Kentish recently, gave him coffee and a comfy chair with a cushion, and asked him to recount some of his experiences of using Moodle quizzes to share with our readers.

Q: How come you are using Moodle Quizzes now?
It all started in 2007 when the new common first year in engineering began. I had always taught my old Engineering Materials course, with a cohort of 60-70 students, using 6-7 multiple-choice and text-based questions, but now, with the new common first year core course, it was going to be 200! There was no way I was going to be able to do all that marking! I knew that the quiz approach had worked well for motivating my students to learn consistently in this course, so I asked for an automated system to help me mark these tests.

It wasn't the first time I had asked for an automated assessment system. The UniSAnet team tried several times but were quite frustrated in their efforts to create an in-house online assessment tool and resorted to trialling the Moodle version. I was one of 2 people who were allowed to use this environment and test it for the University.

How do you use it?
I run 6 tests during the study period for this course. For each of these there is actually a pair of quizzes. The first is a practice (non-assessable) one that students can attempt as many times as they like, finding out which questions they got right or wrong and also what the right answer is. Then there is an assessable version that counts for grades. There are about 30-35 items in the assessable one and 15 or so in the practice version.
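To make the practice/assessable pairing concrete, a rough sketch of the two configurations as plain data might look like the following. The field names, the single-attempt limit on the assessable version and the exact item count are my own illustration, not Moodle settings.

from dataclasses import dataclass

@dataclass
class QuizConfig:
    name: str
    assessable: bool     # does it count towards the grade?
    max_attempts: int    # 0 = unlimited attempts
    show_answers: bool   # reveal right/wrong and correct answers after an attempt?
    num_items: int

# One "test" is really a pair of quizzes on the same topic.
practice = QuizConfig("Test 1 (practice)", assessable=False,
                      max_attempts=0, show_answers=True, num_items=15)
assessed = QuizConfig("Test 1 (assessable)", assessable=True,
                      max_attempts=1, show_answers=False, num_items=33)  # ~30-35 items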

A screen grab from the Moodle quiz environment
 
Students do the assessable quizzes in a computer pool during their tutorial time. I set the time the quiz is open for and a password - I make these up so they are different for each tutorial class. When all the classes have finished taking the test I send them their results which have been automatically marked. 
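As a small illustration of that per-class setup (this is not how Moodle administers it, just a sketch with made-up tutorial names and times), one could generate a distinct password and opening window for each tutorial class like so:

import secrets
import string
from datetime import datetime, timedelta

# Hypothetical tutorial classes and their start times (not real timetable data).
tutorials = {
    "Tute-A": datetime(2009, 4, 6, 9, 0),
    "Tute-B": datetime(2009, 4, 6, 11, 0),
    "Tute-C": datetime(2009, 4, 7, 14, 0),
}

def make_password(length=8):
    """Generate a random password, different for each tutorial class."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

for name, start in tutorials.items():
    window_close = start + timedelta(minutes=50)  # quiz open only during the tutorial
    print(f"{name}: open {start:%a %H:%M} - {window_close:%H:%M}, password {make_password()}")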

What has it been like using Moodle quizzes?
The system itself has been quite good - robust.  There have been virtually no bugs in the system - just one that hasn't been solved. This has no negative effects provided you are aware of it.

Like most things there are positives and negatives - a lot of time went into setting up the database of questions, but the time saved with the auto marking has more than made up for it, particularly as I am now in the second year. The benefit will continue in future years as less time is required to maintain the database.


How do you come up with your questions?
When I first started I had my old questions, which needed to be entered into my database and took a long time. Now, when I want to set up a test I create a shell for it (setting the time and password for the class) and select the questions I want from my database. Each year I choose different questions. I also use the random shuffling functionality - my selected questions are presented in a different order, and the answer options are also in a different order, for each student (this shuffling feature can be turned off).
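As a rough illustration of what that shuffling amounts to (this is not Moodle's code, just a sketch with made-up questions and student identifiers), seeding a random generator per student gives each student their own ordering of questions and answer options, stable for them but different between students:

import random

# Hypothetical question bank entries: (question, answer options).
questions = [
    ("Which alloy ...?", ["A", "B", "C", "D"]),
    ("What is the yield strength ...?", ["A", "B", "C", "D"]),
    ("Which failure mode ...?", ["A", "B", "C", "D"]),
]

def shuffled_quiz(student_id, questions, shuffle=True):
    """Return the question order and per-question option order for one student."""
    if not shuffle:                  # the shuffling feature can be turned off
        return [(q, list(opts)) for q, opts in questions]
    rng = random.Random(student_id)  # seed per student
    qs = list(questions)
    rng.shuffle(qs)                  # question order differs per student
    out = []
    for q, opts in qs:
        opts = list(opts)
        rng.shuffle(opts)            # answer options also come out in a different order
        out.append((q, opts))
    return out

print(shuffled_quiz("student-001", questions))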

Also, at tutorials I get ideas for questions where the content is applied to problems, and I write new questions to add to my database. This is good as I really want to test students' application of knowledge rather than recall of memorised facts.

Also, tutorials are not assessed directly, so to motivate students to engage with the tutorial activities I warn them that a certain number of the quiz items are going to come from the tutorials, so they need to do this work (i.e. memorisation of facts will not be enough to be successful - you are not very good as an engineer if you are not able to apply knowledge to solve problems!)
An example of a question from a practice quiz

What do the students think?
They generally like MCQs best - they think they have an advantage because the right answer is there; they just need to find it. For the first practice quiz there were over 958 attempts - some students attempted it 15 times, others not at all. The average mark for the test was about 70%.
Even though the students were advised that quiz 2 would be more complex than the first, the number of attempts on the practice test decreased to 795 - probably because the students had more competing interests, or had been lulled into a false sense of security by test 1. The average mark for this second test, which was more difficult, was lower - around 60%.