
Personal Response Systems: Review of the Turning Technologies User Conference 2012, Aarhus University, Denmark

August 17, 2012

Two months ago I attended the Turning Technologies User Conference 2012 at Aarhus University, Denmark, the first of its kind in Continental Europe (following the success of last year’s UK conference, reviewed here). Turning Technologies manufactures the electronic Personal Response Systems (PRS) or ‘classroom clickers’ that we use at City University London to poll students’ responses to specific questions posed during lectures, so I was keen to learn more about how other users internationally deploy this technology in their teaching.

A brief outline of each of the sessions is given below. The conference agenda, including abstracts for each of the presentations, is available here and the full conference programme (which was combined with Aarhus University’s ‘Frontiers in Science Teaching’ conference to create a two-day event) may be downloaded here.

Keynote – ‘Turning Lectures into Learning’ (Eric Mazur, Harvard University)

Following a welcome from Michael Broderick, CEO of Turning Technologies, the day opened with a keynote presentation by Professor Eric Mazur, whose ground-breaking teaching method of ‘peer instruction’ has brought him international recognition. He discussed how he developed peer instruction during the early 1990s in response to the transmissive nature of traditional lectures, making lectures more interactive by placing students at the centre of their learning and thereby fostering a deeper level of understanding. In brief, the process is that a key conceptual question is posed; without conferring, the students vote for the answer they believe to be correct; they are then invited to discuss their answer amongst themselves in small groups; finally, the poll is taken again to see if more students have been persuaded towards the correct answer by their peers’ explanations. Professor Mazur illustrated his method with several worked examples, including, towards the end of his presentation, one from his teaching in ethics, to demonstrate that peer instruction is not confined to pedagogical contexts in which there is a single definitive answer.
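By way of illustration, the poll–discuss–repoll cycle just described can be sketched in a few lines of Python. This is a simulation with invented votes, not any real Turning Technologies API; the function names and data are my own:

```python
from collections import Counter

def fraction_correct(votes, correct):
    """Share of the submitted votes that match the correct option."""
    return Counter(votes)[correct] / len(votes)

def peer_instruction_round(first_votes, second_votes, correct):
    """Compare the pre- and post-discussion polls for one concept question."""
    return (fraction_correct(first_votes, correct),
            fraction_correct(second_votes, correct))

# Simulated class of ten: 4 choose the correct option 'B' before
# the small-group discussion, 8 after being persuaded by their peers.
before, after = peer_instruction_round(
    list("BABCABCCAB"), list("BBBCABBBBB"), correct="B")
print(f"correct before discussion: {before:.0%}, after: {after:.0%}")
# → correct before discussion: 40%, after: 80%
```

The comparison of the two polls is exactly what tells the lecturer whether the peer discussion has shifted the class towards the correct answer.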

‘Writing Good, Great, Exceptional Clicker Questions’ (Siara Isaac, EPFL, Lausanne)

‘Museum Studies Using TurningPoint’ (Mikel Asensio, Universidad Autónoma de Madrid and Elena Pol, Interpretart)

Then followed the first of three parallel breakout sessions. In an interactive presentation, Siara Isaac invited the audience to write and revise PRS questions to turn them progressively from good, to great, to exceptional questions that nurtured deep-level understanding. Her discussion was informed by common mistakes in the authorship of multiple choice questions, but she also argued that to write exceptional (rather than merely good or great) PRS questions, a more creative approach may be necessary. Meanwhile, in the room next door, Professor Mikel Asensio and Dr Elena Pol discussed their use of personal response systems within museum studies. They noted that the information traditionally provided in museums was often quite weighty (for example, large amounts of printed text mounted on walls) and that this is not particularly engaging or interactive for visitors. As a solution, their institution has taken to using personal response systems to stimulate their guests’ interest in their collections as well as to gather important demographic information about them.

‘Interaction in Lectures with Mobile Devices’ (Will Moindrot, University of Manchester)

‘Using Electronic Voting Systems in the Arts and Humanities’ (Christopher Wiley, City University London)

After lunch, Will Moindrot discussed the logistical challenges presented by the use of personal response systems in lectures involving large numbers of students (it was particularly interesting to consider this perspective given that personal response systems are often cited as a means of dealing effectively with large-group teaching). He reported back on the students’ experiences of the solution implemented at the University of Manchester, namely the use of ResponseWare technology (a good explanation of which can be found here) to enable students to vote using their own mobile devices without the need to be supplied with a bespoke handset. The concurrent session was my own presentation on using personal response systems in the arts and humanities. I (Dr Christopher Wiley) argued for the potential of PRS to enhance teaching in areas other than the traditional sciences, for instance, by soliciting audience opinion on a contentious point (with the aim of nurturing debate and generating arguments for and against prior to a repoll), or asking ‘subjective’ questions that stimulate discussion among students in that there may be more than one valid or correct answer. My presentation was illustrated by examples drawn from my teaching as a music lecturer who has used PRS for the past four years, as well as feedback received from my students.

‘Diagnostic Processes In General Practice’ (Lars Bjerrum, Copenhagen University)

‘Improving Practice and Addressing Practicalities: Embedding Audience Response Systems at the University of Kent’ (Daniel Clark, University of Kent)

In the final breakout session of the day, Professor Lars Bjerrum explained how personal response systems may be used to illustrate different approaches to the diagnostic process in general practice (as distinct from the diagnostic process within the context of a hospital). Such approaches include pattern recognition and deductive reasoning, and his presentation referred specifically to patients in primary care. Next door, Daniel Clark discussed e-learning strategy at the University of Kent in relation to the use of personal response system technology, which was piloted there five years ago. He spoke about positive feedback received from staff about the pedagogical value of PRS, current practices at the university (for instance, using PRS to facilitate revision sessions), the challenges posed by the embedding of this new technology in teaching and the solutions that were implemented, and the means by which PRS is promoted to staff on a continuing basis through training sessions (see here for further information). His presentation yielded an insight into the strategy of a single higher education institution as well as offering helpful guidance to others seeking to implement similar initiatives within their own contexts.

‘Turning to Your Neighbour’ (Julie Schell, Harvard University)

The day concluded with a follow-up session exploring peer instruction, led by Dr Julie Schell, who amplified the specific concept of ‘turning to your neighbour’ which is at the heart of the method. Through worked examples, she discussed the benefits and drawbacks of two fundamental questions relating to its implementation: whether or not to take an initial vote before inviting the students to discuss a given question with their neighbour; and at what stage in the peer instruction process to display the poll results to the audience. Professor Mazur’s opening keynote had already given us experience of some of these different approaches, for instance, revealing the results of the initial poll to us immediately for one question but withholding them for another.

The conference also benefitted from a number of poster presentations, including ‘Adding Value To Your Handsets – Making Video Interactive’ (Sue Palmer, Empowering Confidence) and ‘Clicking Your Way to Research Data’ (Sue McMillen, Buffalo State College). All in all, it was a highly informative event and a valuable opportunity to network with people implementing personal response systems in a variety of technology-enabled teaching settings, and to share thoughts, practices, and solutions.


Use of the Personal Response System for Formative Assessment in Optometry: Dr Byki Huntjens and Dr Steve Gruppetta

With the recent founding of the University Personal Response System (PRS) Steering Group, co-chaired by Dr Siân Lindsay and Farzana Latif, this would seem to be an opportune time to profile one of the innovative approaches implemented within the University in using PRS technology for formative assessment.

Dr Byki Huntjens and Dr Steve Gruppetta are lecturers in the Division of Optometry and Visual Science who have introduced the PRS to undergraduate students so that they may receive immediate classroom feedback during Clinical Skills and Optics lectures. A PRS handset is issued to each student (against a small deposit) for the duration of their degree programme, and is registered to their name so that responses can be matched to individuals. Each lecture features a succession of multiple choice questions (MCQs). Byki’s practice is to begin each subsequent lecture with a set of MCQs covering the previous topic plus the background reading for the class, and to test the students’ understanding of the new topic later in the lecture. Steve includes material that potentially encompasses the previous lecture, the current lecture, or even paves the way for a new topic to be discussed. The end result is a series of technology-enabled formative assessments.

Although only the group scores are shown during lectures and the progress of individual students is not revealed, the results of the quizzes are uploaded to Moodle each week by topic and the students are thereby able to check their individual score. This enables them to track their progress over time, and doubles as a reminder of the topics to which they need to direct particular attention prior to the examinations. The Moodle grade book also shows the students’ ranking among the whole group, leading some of them to become slightly competitive. Indeed, the element of competition is actively nurtured – the top five students with the highest marks in the year are awarded a prize at the divisional Prize Giving event.
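As a minimal sketch of the ranking idea behind the leaderboard and the top-five prize (the student names and scores below are invented for illustration; this is not the actual Moodle grade book logic):

```python
# Hypothetical cumulative quiz scores for a cohort; ranking them and
# taking the first five mirrors the leaderboard/prize idea described above.
scores = {
    "Student A": 87, "Student B": 92, "Student C": 78,
    "Student D": 95, "Student E": 81, "Student F": 90, "Student G": 84,
}

# Sort by score, highest first.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
top_five = ranked[:5]

for rank, (name, score) in enumerate(top_five, start=1):
    print(f"{rank}. {name}: {score}")
# → 1. Student D: 95  (and so on down to rank 5)
```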

The students have shown real enthusiasm during the PRS quizzes and appreciate the immediacy of the feedback, the anonymity of the process, and the way that it punctuates the lecture by providing an interlude. Steve has developed the practice of making the PRS quizzes, which he calls the ‘Optics Challenge’, distinct from the rest of the lecture by changing the background of the slide from white to black (see screenshot below). The students’ responses are also used by the tutors to adapt subsequent lectures to the level of understanding of the specific cohort; this has prompted a change of direction on several occasions. In addition, this information has enhanced the support that the tutors are able to offer when students have sought extra help.

The Optics Challenge Leaderboard

Byki delivered a presentation on the use of PRS technology for formative assessment at the Fourth Annual ‘Learning at City’ Conference on 13 June, 1.20-2.00pm (the video is available here).

Christopher Wiley, Byki Huntjens, and Steve Gruppetta
with thanks to Siân Lindsay and Farzana Latif

Innovation in Assessment and Feedback

April 20, 2012

My dual role as University Learning Development Associate in Assessment & Feedback and Senior Lecturer in Music has led me to run several pilot projects in my teaching this academic year (2011-12), exemplifying innovative approaches to the practices surrounding assessment and feedback. Three case studies are given below.

(1) Using wikis in Moodle to track progress on undergraduate dissertations and deliver formative feedback

Last term I set up a wiki template in Moodle to provide each of my final-year undergraduate dissertation students with a resource that both of us could access and periodically update, for the purposes of tracking progress on their dissertations and offering formative feedback on draftwork submitted.

Major Project wiki

The wiki includes pages for the project’s working title, and a separate page for each of the meetings, divided into sections for the date of the meeting, a summary of what was discussed, objectives agreed for next time, and the date of the next meeting (see screenshot, right). It was developed owing to the need to help undergraduate students keep on-track in their dissertation work at a critical time in their programme, and was inspired by the Moodle wiki previously set up for the purposes of recording undergraduate Personal Development Planning (PDP) as well as the University’s use of Research And Progress for postgraduate research students.

One student has engaged with this resource to the extent that he has created several new pages to record his ongoing progress in between supervisory meetings; the nature of the wiki is such that I can review his progress at any time and add suggestions or make revisions as needed. Another student always brings her Wi-Fi enabled laptop with her so that we can make updates to the wiki during our tutorials. Whenever one of us makes and saves a change, the other can instantly see it on their screen, which demonstrates the value of using mobile devices to support student learning – particularly as this student now takes the lead at the end of each supervision in ensuring that the wiki has been fully updated.

This would seem to be a helpful way of time-managing the task of researching and writing a dissertation, not least given that it is a challenging process that final-year undergraduates may be encountering for the first time. It also provides a concise and useful reminder (for supervisor as well as student) of discussions, progress, and objectives set at each meeting, while enabling them to take ownership of their learning. This pilot will be rolled out across the entire module next year and all final-year Music students will be expected to use it; there is also much potential for initiatives of this nature to be extended to other programmes and subject areas.

(2) Curriculum design developed in dialogue with the students: elective assessment components

One innovative assessment model that I have been developing for much of this academic year involves giving students some choice as to how they wish to be assessed. Consultation with senior academic staff within and beyond the University has identified that, while such practices are more logistically complex, it should not be supposed that there is necessarily only one way to assess students against a prescribed set of learning outcomes.

After considering several possible assessment patterns which were discussed with colleagues, I settled on the following model, which essentially preserves the 30:70 ratio (standard across the institution) between the minor and major assessment points:

EVS graph

  • 1 Written Examination (unseen): 30 marks
  • 1 Elective Assessment: 30 marks – the student chooses ONE of the following options:
    • Written Coursework
    • Oral Presentation
    • Musical Performance accompanied by Written Documentation
  • 1 Project developed from the above Elective Assessment: 40 marks

The Examination provides a common component for all students, irrespective of the pathway they choose for the Elective Assessment. The other assessments have been specified with parity to existing module assessment patterns in mind. The benefits to students are that the initiative enables them to play to their strengths, and to influence how they wish to be assessed and how they wish their marks to be apportioned. The Elective Assessment also permits an additional opportunity for interim feedback ahead of the final Project.
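The arithmetic behind the pattern can be checked in a few lines (a simple sketch; treating the interim Elective Assessment as the ‘minor’ point and the remaining 70 marks as the major is my reading of the 30:70 split):

```python
# Sketch: verify that the assessment pattern totals 100 marks and
# preserves a 30:70 split between an interim point and the remainder.
components = {
    "Written Examination (unseen)": 30,
    "Elective Assessment": 30,
    "Project developed from the Elective Assessment": 40,
}

total = sum(components.values())
assert total == 100  # marks must account for the whole module

minor = components["Elective Assessment"]  # interim assessment point
major = total - minor                      # examination + final project
print(f"minor:major = {minor}:{major}")
# → minor:major = 30:70
```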

My consultation with the students as to whether such an innovation would be welcomed was revealing: the graphical result (above right) of a poll conducted anonymously using EVS handsets (clickers) speaks for itself.

The focus group, comprising 12 students from my class, was also consulted on several other major points of curriculum design, including the content and schedule of the lectures as well as the manner in which they would be taught and assessed and feedback delivered. They decided upon all of the lecture topics themselves via a Doodle poll, and have been invited to write supplementary assessment criteria using a wiki; elements of self- and peer assessment will also be included in the module. Having discussed several different forms of feedback (written, dialogic, telephone, podcast, screencast) at the focus group, the students were polled: 33% said that they would prefer written reports, while fully 50% opted for dialogic feedback – an unexpected but welcome result.

(3) Student self-assessment of in-progress writing of a research dissertation

Earlier in the year, one of my senior postgraduate research students submitted a draft of a dissertation chapter to me in the knowledge that while some sections were complete, others would need revision either because she felt that they would benefit from further work or because she had yet to complete the research (largely ethnographic, for which she is entirely dependent on the availability of her study participants) that would enable her to finalize her writing.

Since I nonetheless wanted to give her feedback on her work in progress, I formulated the idea of suggesting to the student that after a couple of weeks she should return to the draft chapter herself to reflect upon her writing, and to embed comments electronically using Microsoft Word to identify sections where she felt that further revision would be necessary and to explain why. I would then overlay my own feedback in a similar manner.

In being able to review draftwork that the student had herself annotated, I found my attention being much more effectively directed towards the parts of the chapter upon which it was most fruitful to focus. I would have made many of the same comments as the student herself, and this means of reflection also enabled her to ask further questions of her work, to which I was then able to respond, so that we engaged in a form of written dialogic feedback (see screenshot below).

The student likewise reported that she found it very useful to return to her chapter in retrospect, and particularly to document the areas she believed required additional work. This is a model of self-reflective feedback that I am now seeking to adopt for future research students.

Dissertation feedback sample

Dr Christopher Wiley

A fresh take on the classroom clickers (aka ‘PRS’)

October 28, 2011

Just yesterday I attended and presented at the first European Turning Technologies User Conference at the University of Surrey. Turning Technologies are the US-based manufacturers of the classroom clickers / PRS that we use here at City. The keynote speaker for the day was none other than Harvard Professor Eric Mazur, developer of the ‘Peer Instruction’ model of teaching and learning. His keynote and all other recordings from the day are on YouTube here.

What I presented on:

I talked about and demonstrated a TurningPoint technology called ResponseWare (RW) and discussed how I had piloted this with Cengiz Turkoglu and his students from the School of Engineering and Mathematical Sciences (SEMS) at City last year. RW basically enables students to vote with their mobile devices as opposed to the regular clickers – this technology brings with it many practical and pedagogic benefits (but some challenges too!). I also talked a bit about how we knew that our students weren’t really happy with the idea of using their mobile devices in class, but trialled RW with them nonetheless! The slides from my presentation, which explain what happened, can be found here.

ResponseWare at the University of Surrey

Paul Burt and his colleague Ceri Seviour at the Centre for Educational and Academic Development (CEAD) at the University of Surrey gave a presentation on their experience of using RW. They started off by giving some background about their highly successful clicker library loan system, which has seen zero losses in clickers since it was introduced 4 years ago! However, they explained that one problem they faced was that students perceived the non-return fine for borrowed clickers as a significant risk – this was reducing engagement with the technology. Further, they knew that over 80% of their students owned some kind of internet-accessible mobile device, so why not make use of these? A further justification for using RW was that it removed the headache of worrying about mixing channels (seasoned clicker users will know what I’m talking about here!).

Paul and Ceri then discussed their very recent experiences of using RW:

  • They piloted with 80 RW licences initially and now have 450 (since Sept ’11). Each RW licence is available for 2 years and is based on concurrent use.
  • They use a simple Outlook calendar to manage the distribution of licences across the University; this is working well, and since the start of the semester they have had several lectures (sizes ranging from 100 to 350 students) using RW
  • They were given significant IT support throughout, and this was highlighted as key to the success of the project – it was necessary in order to troubleshoot problems relating in particular to the WiFi
  • They found that students own many more mobile devices than they had anticipated, and many had non-English configurations (which made running around trying to troubleshoot on Chinese-language devices almost impossible!)
  • They did experience performance issues which are as yet undiagnosed but are being worked on, e.g. students experienced quite a significant lag in the display of polling results on their mobile device screens. It remains unknown whether these issues relate to the WiFi connections or to a TurningPoint server issue.
  • Student evaluations of RW are on the whole quite positive – 59% said they liked the concept of using voting technology in lectures and were happy to use their own device to vote

Tips for a successful RW experience

Paul and Ceri helpfully outlined some tips for getting the best out of RW:

  1. Generate helpsheets not only for staff but also for students
  2. Set up a website that contains links for direct access to the RW app for a variety of different mobile operating systems (TurningPoint has one but it is too US-centric and confusing for UK-based students)
  3. Before you start, seek assurances about the capacity of the WiFi for mobile devices in the classroom where you intend to teach. WiFi access points may indicate that they can support 128 connections, but in practice they may only support half this number (i.e. 64) when connecting mobile devices – this is due to a technical point which I’m still struggling to get my head around, but Paul said it does happen and we have to watch out for this!
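Tip 3 implies a simple back-of-envelope calculation: if an access point nominally rated for 128 connections only sustains about half that with mobile devices, you can estimate how many access points a lecture theatre would need. This is a rough sketch under that assumption, not a WiFi engineering calculation:

```python
import math

def access_points_needed(students, rated_capacity=128, effective_fraction=0.5):
    """Rough estimate of access points required if mobile devices only
    achieve about half an AP's rated connection count (per the tip above)."""
    effective = int(rated_capacity * effective_fraction)  # ~64 devices per AP
    return math.ceil(students / effective)

# For a 350-student lecture (the largest size Paul and Ceri mentioned):
print(access_points_needed(350))  # → 6
```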

Using clickers to support the marking and moderating process

Fellow Brit Dr Abby Cathcart, now based at the Queensland University of Technology (QUT), talked to us about her experience of teaching (and assessing) cohorts of up to 1250 students! In addition to widely differing class sizes, Abby also jokingly contrasted the average climate of Sunderland (where she worked 4 years ago – 9°C) with that of Queensland (a sunny 26°C!). She admitted that the first time she heard about clickers she thought they were ‘edutainment’, but after her experience she is completely convinced of their pedagogic potential, especially in terms of improving feedback to students.

Abby’s main use of the clickers was in marking teams: for 1250 students, marking teams are typically made up of 25 tutors, all of whom have varying levels of experience and opinions. She uses the clickers to collect opinions from all 25 markers about student work, as in the past she found that one or two voices (usually the experienced tutors) would tend to dominate when all the tutors met. Marking novices would rarely speak up if they thought their opinion would rock the boat. Abby found that more graders participated in the moderation discussion when clickers were used, with markers commenting that they felt ‘bolder’ and ‘more confident and prepared’. The paper outlining Abby’s work can be found here.

Using clickers to highlight feedback for students

Finally, Dr Cathcart also talked about strategies she uses to improve the way students engage with the feedback given to them. She found that nearly 45% of her students didn’t even bother to collect the feedback she had so painstakingly written for them (this is in line with national figures). She also took into account a point made by Phil Race, who said that students must receive feedback on their assessments within 48 hours, otherwise there is no point in giving it! To overcome this, Abby really emphasises to students at the start of the course that giving them feedback is ‘something we do really well’ and that students can expect high-quality feedback on their assessments. Abby gets her students to understand their assignments and the assessment criteria by showing them a sample from a real piece of student work in relation to an assessment criterion, then having the students use the clickers to vote on which mark they would give it. This is followed up by group work in which students discuss the mark – as Abby says, this ‘springboards a social construction of how to make sense of the assessment criteria’.

Final Thoughts

I’m glad I attended this conference. I’ll admit I did think it was going to be very corporate and that Turning Technologies were going to try to sell stuff to me. However, I couldn’t have been more wrong: its focus was on the practical and the pedagogical in equal measure – highly useful and thought-provoking too. Looking forward to the next one!
