
Posts Tagged ‘module design’

Twitter in the University Classroom: Live-Tweeting During Lectures

January 3, 2013

My second blog post reflecting on teaching innovations of 2012 is dedicated to my use of Twitter during one undergraduate module in the year just passed. My original intention, in embedding a Twitter widget within one of my Moodle pages, was merely to issue the occasional message to students, for instance to keep them informed of my progress with marking their assessments. However, when I announced our ‘official’ Twitter hashtag to the students, to my surprise and delight, they started to use it not just for my module but to tweet about other areas of the programme as well. Even students not on the module started using the hashtag!

A few weeks into my module, I discovered that students who brought to class mobile devices connected to the wireless network (see my previous post on BYOD) had been tweeting about the lecture as it was taking place, prompting me to tweet back during the break. At this point, with the help of several colleagues from the Learning Development Centre (thanks are due to Neal Sumner, Siân Lindsay, and particularly Ajmal Sultany), I investigated a means of live-tweeting during lectures without interrupting the rest of the teaching, such as my use of PowerPoint and audiovisual examples.

Chris Wiley - Live-Tweeting During Lectures

Having looked into a number of different desktop-based Twitter clients to see whether they would meet my rather specific requirements, I found that Twhirl worked perfectly, with a search set up for the hashtag. I needed to increase the number of seconds for which the desktop alert is displayed, to give the students sufficient time to read it before it disappeared (I have to confess that since the alerts are only visible for c.15 seconds, a student and I had to mock up the photograph, right). I also found it necessary to lower the resolution on my laptop, because otherwise the alerts would have appeared off the far right-hand side of the screen when projected through the teaching pod.
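
By way of illustration: the approach above relied on the Twhirl desktop client rather than any custom code, but the underlying idea of a running hashtag feed could equally be sketched against the Twitter search API, for instance via the tweepy Python library. The hashtag, credentials, and polling interval below are hypothetical, and current API access requires registered credentials.

    # Illustrative sketch only: the classes used the Twhirl desktop client, not custom code.
    # Assumes a registered Twitter API bearer token and the tweepy library (v4+);
    # the hashtag, token, and polling interval are hypothetical.
    import time
    import tweepy

    BEARER_TOKEN = "YOUR-BEARER-TOKEN"     # hypothetical credential
    HASHTAG = "#mymodule2012"              # hypothetical module hashtag

    client = tweepy.Client(bearer_token=BEARER_TOKEN)
    last_seen_id = None

    while True:
        # Fetch only tweets newer than the last one displayed.
        kwargs = {"since_id": last_seen_id} if last_seen_id else {}
        response = client.search_recent_tweets(query=HASHTAG, max_results=10, **kwargs)
        for tweet in reversed(response.data or []):
            print(f"[{HASHTAG}] {tweet.text}")           # stand-in for an on-screen alert
            last_seen_id = max(last_seen_id or 0, tweet.id)
        time.sleep(30)                                   # poll every 30 seconds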

It took a little while to get it just right, but having found workarounds for the various technological and logistical challenges, in several classes (with the aid of my trusty iPad) I provided a running Twitter feed before, during, and after the lecture, which helped keep students’ attention focussed on the key points and issues, particularly when audiovisual examples were playing. A few students (though perhaps not as many as I’d hoped) followed my lead and tweeted their own thoughts too, all of which were displayed in real time on the projector screen at the front of the classroom. We also received tweets from former students who had taken the module in previous years, from staff elsewhere in the University who picked up news of the lectures via Twitter, and even, occasionally, retweets from users unknown to us – an ideal reminder that we were discussing real-life issues with a bearing on the world beyond the confines of the University.

The disadvantages of live-tweeting include that the author of a given message is publicly identified rather than anonymous (perhaps this was why some students used the hashtag only outside the classroom, rather than having their tweets appear on the projector screen during class), and that the tutor cannot anticipate the timing or content of a tweet, so there is a danger that it might interrupt the flow of the lecture. Nonetheless, although an ambitious undertaking, it did seem to be an effective way of using Twitter to enhance teaching without placing it at the centre of teaching. It also provided a novel means of engaging the students, including some who might not have been quick to contribute to face-to-face class discussion.

Were I to take Twitter back into the University classroom in the future, there are a couple of additional possibilities I might seek to implement. One is to pass a mobile device or two round the class and appoint specific students to be responsible for providing a running Twitter commentary for a given lecture. Another is to embed tweets within my PowerPoint presentation via a Twitter Tools add-in, such that they are automatically posted (and the alert received) upon reaching the associated slide. Using these Twitter Tools, it is even possible to include a tweet cloud in a PowerPoint presentation, and to embed a real-time Twitter ticker feed at the bottom of each slide, which might ultimately obviate the need for a desktop-based client. Much to think about for 2013!

Self- and Peer Assessment using Turnitin in SEMS: Cengiz Turkoglu

August 1, 2012

Cengiz Turkoglu, a Senior Lecturer in the School of Engineering and Mathematical Sciences, principally teaches final-year undergraduate students and one of the MSc Aviation Management modules, with class sizes usually not exceeding 20 students. Each of his modules uses a similar assessment pattern comprising one coursework plus an examination. For the coursework component, he utilizes the self-review and peer review functions of Turnitin as part of the assessment.

The coursework has an initial deadline at least 6-8 weeks into the module, to allow students sufficient time to conduct research and write their essays. Once the students have submitted their paper, Turnitin’s PeerMark assignment function allows them either to be paired or to be randomly allocated another paper, which they are then required to peer-review. Given that there is always a range of standards represented by the students and their papers, one dilemma that Cengiz has faced concerns whether to pair the students randomly or to attempt to group them according to their standard. He never pairs them such that two students are asked to review one another’s papers.
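
Turnitin’s PeerMark performs the allocation itself; purely as an illustration of one simple scheme that avoids reciprocal pairings (a hypothetical sketch, not Cengiz’s or Turnitin’s method), a rotation over a shuffled list guarantees that no two students review one another’s papers whenever the class has at least three students.

    # Illustrative sketch only: Turnitin's PeerMark performs the allocation itself.
    # A simple rotation over a shuffled list guarantees no reciprocal pairs
    # (each student reviews the next student's paper) whenever there are 3+ students.
    import random

    def allocate_reviews(students):
        """Return a dict mapping each student to the peer whose paper they review."""
        order = list(students)
        random.shuffle(order)
        return {order[i]: order[(i + 1) % len(order)] for i in range(len(order))}

    print(allocate_reviews(["Ann", "Bilal", "Chen", "Dipa"]))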

The feedback provided by each student in peer review is subsequently made available to the original author – and the students are made aware at the time of writing that their comments will be released in this manner. At the same time, each author is asked to take a self-assessment exercise that follows exactly the same format as the peer review. As the process is conducted entirely online using Turnitin, it is completely paperless, which reduces the administrative workload and makes for a more sustainable structure.

For Cengiz, self- and peer review are only valuable if they lead somewhere in terms of the assessment process. With that in mind, once the feedback has been exchanged between students, Cengiz gives them a week to undertake further revisions to their original submission should they wish to do so. He asks that they do not rewrite their paper substantively, but confine themselves to minor amendments. Plagiarism of the peer-review feedback is not an issue because all the material is traceable and hence can be attributed. Only after the revised submission has been received does Cengiz mark the work summatively using GradeMark and provide his own feedback.

Detailed assessment criteria are provided, with the marking criteria broken down into six categories, each with its own weighting, one of which is self- and peer review (worth 10% of the mark). The students are therefore aware from the outset that it is an integral part of the assessment, and its summative nature encourages them to engage fully with the process, since Cengiz’s experience is that students can be very assessment-driven. The questions they are asked for the self- and peer reviews correspond to the other assessment categories, so they judge each other’s paper, and their own, in exactly the same way as the examiner.
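
As a worked illustration of how such a weighted scheme combines into a single coursework mark: only the 10% self- and peer-review weighting is stated above, so the other category names, weightings, and marks below are hypothetical examples.

    # Illustrative sketch of a six-category weighted marking scheme.
    # Only the 10% self- and peer-review weighting is stated in the post;
    # the other category names, weightings, and marks are hypothetical examples.
    criteria_weights = {
        "understanding of the question": 25,
        "research and use of sources":   25,
        "coherence of argument":         20,
        "structure and presentation":    10,
        "referencing":                   10,
        "self- and peer review":         10,   # the weighting stated in the post
    }   # percentages, summing to 100

    def coursework_mark(category_marks):
        """Combine per-category marks (each out of 100) into an overall weighted mark."""
        assert sum(criteria_weights.values()) == 100
        return sum(criteria_weights[c] * m for c, m in category_marks.items()) / 100

    print(coursework_mark({
        "understanding of the question": 70,
        "research and use of sources":   65,
        "coherence of argument":         60,
        "structure and presentation":    72,
        "referencing":                   68,
        "self- and peer review":         80,
    }))   # -> 67.75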

Cengiz has found this to be a very valuable exercise. It sets the students thinking about how to frame feedback, offering helpful advice to the author rather than simply giving praise or criticism. It also encourages them to consider issues such as whether the author understood the question and maintained focus, how well they researched the subject, and how coherent the arguments they presented were, based on their own reasoning or on factual information identified during their research. (The criteria matrix used by Cengiz is shown below; this is also entered as the rubric in Turnitin.) While students vary in their engagement with the process, Cengiz notes that the best self-reviews and peer reviews recognize areas where the submission can be improved.

Turnitin screenshot - criteria matrix

Cengiz argues that the value of this assessment model is that it simulates real-life scenarios. In safety-critical industries such as aviation, for example, maintenance engineers are expected to inspect each other’s work on a regular basis, and the peer review process is widely used, particularly by design engineers. In addition, all engineers should be expected to reflect upon, and to strive to improve, their own performance in order continually to develop themselves professionally. They will not necessarily always receive the most favourable advice from their own peers, so nurturing skills such as evaluating the feedback they receive and exercising their own judgement when taking decisions prepares engineering students effectively for the profession.

Cengiz has equalized the weightings of the coursework and examination (originally 30% and 70% respectively), citing the introduction of the self-assessment and peer-review requirements as justification for giving greater weight to the coursework component. He strongly believes that examination is not the only suitable assessment method for his modules, as the nature of the topics he teaches requires understanding and the ability to apply that knowledge to real-life scenarios, rather than merely memorising content from textbooks or course notes. After studying on the Postgraduate Certificate in Academic Practice programme delivered by the Learning Development Centre at City University London, Cengiz has become an advocate of self-directed and reflective learning, and he encourages his students to become more critically self-reflexive so that they can learn from their own experiences.

If you would like to know more about this assessment model, Cengiz is happy to be contacted by e-mail: cengiz.turkoglu.1@city.ac.uk.

Christopher Wiley and Cengiz Turkoglu

Use of the Personal Response System for Formative Assessment in Optometry: Dr Byki Huntjens and Dr Steve Gruppetta

With the recent founding of the University Personal Response System (PRS) Steering Group, co-chaired by Dr Siân Lindsay and Farzana Latif, this would seem to be an opportune time to profile one of the innovative approaches implemented within the University in using PRS technology for formative assessment.

Dr Byki Huntjens and Dr Steve Gruppetta are lecturers in the Division of Optometry and Visual Science who have introduced the PRS to undergraduate students so that they may receive immediate classroom feedback during Clinical Skills and Optics lectures. A PRS handset is issued to each student (against a small deposit) for the duration of their degree programme, and is registered to their name to enable responses to be matched to individuals. Each lecture features a succession of multiple-choice questions (MCQs). Byki’s practice is to begin subsequent lectures with a set of MCQs covering the previous topic plus the background reading for the class, and to test the students’ understanding of the new topic later in the lecture. Steve includes material that potentially encompasses the previous lecture, the current lecture, or even paves the way for a new topic to be discussed. The end result is a series of technology-enabled formative assessments.

Although only the group scores are shown during lectures and the progress of individual students is not revealed, the results of the quizzes are uploaded to Moodle each week by topic and the students are thereby able to check their individual score. This enables them to track their progress over time, and doubles as a reminder of the topics to which they need to direct particular attention prior to the examinations. The Moodle grade book also shows the students’ ranking among the whole group, leading some of them to become slightly competitive. Indeed, the element of competition is actively nurtured – the top five students with the highest marks in the year are awarded a prize at the divisional Prize Giving event.
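
By way of illustration, weekly scores and a whole-group ranking of the kind described could be derived from handset responses along the following lines. This is a hypothetical sketch with invented handset IDs, topics, and marks; in practice the results are simply uploaded to the Moodle grade book rather than computed with custom code.

    # Illustrative sketch only: in practice the quiz results are uploaded to the
    # Moodle grade book. Handset IDs, topics, and marks below are hypothetical.
    from collections import defaultdict

    # One row per MCQ answered: (handset_id, topic, marks_awarded)
    responses = [
        ("H001", "Geometrical optics", 1), ("H001", "Geometrical optics", 0),
        ("H002", "Geometrical optics", 1), ("H002", "Geometrical optics", 1),
        ("H001", "Clinical skills", 1),    ("H002", "Clinical skills", 0),
    ]

    totals = defaultdict(int)                            # overall score per handset
    by_topic = defaultdict(lambda: defaultdict(int))     # weekly/topic score per handset

    for handset, topic, mark in responses:
        totals[handset] += mark
        by_topic[topic][handset] += mark

    # Per-topic scores, as uploaded to Moodle each week by topic
    for topic, scores in by_topic.items():
        print(topic, dict(scores))

    # Ranking of the whole group, highest score first (as shown in the grade book)
    leaderboard = sorted(totals.items(), key=lambda item: item[1], reverse=True)
    for rank, (handset, score) in enumerate(leaderboard, start=1):
        print(rank, handset, score)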

The students have shown excitement during the PRS quizzes and appreciate the immediacy of the feedback, the anonymity of the process, and the way that it articulates the lecture by providing an interlude. Steve has developed the practice of making the PRS quizzes, which he calls the ‘Optics Challenge’, distinct from the rest of the lecture by changing the background of the slide from white to black (see screenshot below). The students’ responses are also used by the tutors to adapt subsequent lectures to the level of understanding of the specific cohort; this has prompted a change of direction on several occasions. In addition, this information has enhanced the support that the tutors are able to offer when students have sought extra help.

The Optics Challenge Leaderboard

Byki delivered a presentation on the use of PRS technology for formative assessment at the Fourth Annual ‘Learning at City’ Conference on 13 June, 1.20-2.00pm (the video is available here).

Christopher Wiley, Byki Huntjens, and Steve Gruppetta
with thanks to Siân Lindsay and Farzana Latif

A Case Study of Interim Assessment in SEMS: Mary Aylmer

Mary Aylmer is a visiting lecturer in the School of Engineering and Mathematical Sciences (SEMS), teaching the CAD part of the module CV1407 IT skills, Communication, and CAD. She has developed an assessment pattern in which students produce five pieces of CAD coursework, each of which involves completing engineering drawings. There are two interim submissions each weighted at 2% of the final module mark, two larger submissions weighted at 16% and 40%, and an end-of-module test also weighted at 40%.

The 2% weighting for the interim submissions is intended to ensure that the students’ early work on the module is taken into account in the final module mark, which helps to focus them on the task. The exercises are carefully graded and enjoyable for the students to complete; they tend to take ownership of their own learning because the assessments are designed such that they can determine exactly what is required of them, and so can aspire to high marks.
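
As a worked illustration of the weighting pattern described above (the weightings come from the post; the component marks are hypothetical), note in particular how little a single 2% interim submission can move the final module mark.

    # Illustrative sketch of the CV1407 weighting pattern described above.
    # The weightings come from the post; the component marks are hypothetical.
    weights = {"interim 1": 2, "interim 2": 2, "coursework 3": 16,
               "coursework 4": 40, "end-of-module test": 40}      # percentages

    def final_module_mark(component_marks):
        """Weighted sum of component marks, each out of 100."""
        assert sum(weights.values()) == 100
        return sum(weights[c] * m for c, m in component_marks.items()) / 100

    marks = {"interim 1": 75, "interim 2": 80, "coursework 3": 62,
             "coursework 4": 58, "end-of-module test": 66}
    print(final_module_mark(marks))                         # -> 62.62

    # Even scoring zero on a 2% interim submission shifts the final mark by at most 2 marks:
    print(final_module_mark({**marks, "interim 1": 0}))     # -> 61.12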

SEMS CAD CV1407

The obvious advantage of this assessment pattern is that it ensures that the students actually complete their initial work on the module. This means that they are well prepared for the larger submissions: they have already accrued plenty of experience of CAD in the first few weeks through the interim submissions, and are thereby placed in a strong position to tackle the more difficult drawings. In other words, it ensures that they undertake the groundwork first.

The downside to this system for the tutor is that it generates a substantial amount of marking. Mary has also noted a tendency among students to query their marks, even in the case of the 2% submissions, which are unlikely to have a significant impact on their overall degree average. It can become very time-consuming to justify marks deducted, particularly with 120 students each submitting five pieces of work.

Nonetheless, the outcomes speak for themselves. By the end of the module, the students can produce good CAD drawings fairly easily; and they have indicated through their feedback that they enjoy the course, which is very encouraging. While an assessment model such as this may be time-consuming for the tutor, it is evidently worth the investment if it results in robust learning and student satisfaction.

Christopher Wiley and Mary Aylmer

Innovation in Assessment and Feedback

April 20, 2012

My dual role as University Learning Development Associate in Assessment & Feedback and Senior Lecturer in Music has led me to run several pilot projects in my teaching this academic year (2011-12), exemplifying innovative approaches to the practices surrounding assessment and feedback. Three case studies are given below.

(1) Using wikis in Moodle to track progress on undergraduate dissertations and deliver formative feedback

Last term I set up a wiki template in Moodle to provide each of my final-year undergraduate dissertation students with a resource that both of us could access and periodically update, for the purposes of tracking progress on their dissertations and offering formative feedback on draftwork submitted.

Major Project wiki

The wiki includes a page for the project’s working title, and a separate page for each meeting, divided into sections for the date of the meeting, a summary of what was discussed, the objectives agreed for next time, and the date of the next meeting (see screenshot, right). It was developed in response to the need to help undergraduate students keep on track with their dissertation work at a critical time in their programme, and was inspired by the Moodle wiki previously set up for the purposes of recording undergraduate Personal Development Planning (PDP), as well as the University’s use of Research And Progress for postgraduate research students.

One student has engaged with this resource to the extent that he has created several new pages to record his ongoing progress in between supervisory meetings; the nature of the wiki is such that I can review his progress at any time and add suggestions or make revisions as needed. Another student always brings her Wi-Fi enabled laptop with her so that we can make updates to the wiki during our tutorials. Whenever one of us makes and saves a change, the other can instantly see it on their screen, which demonstrates the value of using mobile devices to support student learning – particularly as this student now takes the lead at the end of each supervision in ensuring that the wiki has been fully updated.

This would seem to be a helpful way of time-managing the task of researching and writing a dissertation, not least given that it is a challenging process that final-year undergraduates may be encountering for the first time. It also provides a concise and useful reminder (for supervisor as well as student) of discussions, progress, and objectives set at each meeting, while enabling them to take ownership of their learning. This pilot will be rolled out across the entire module next year and all final-year Music students will be expected to use it; there is also much potential for initiatives of this nature to be extended to other programmes and subject areas.

(2) Curriculum design developed in dialogue with the students: elective assessment components

One innovative assessment model that I have been developing for much of this academic year involves giving students some choice as to how they wish to be assessed. Consultation with senior academic staff within and beyond the University has identified that, while such practices are more logistically complex, it should not be supposed that there is necessarily only one way to assess students against a prescribed set of learning outcomes.

EVS graph

After considering several possible assessment patterns, which were discussed with colleagues, I settled on the following model, which essentially preserves the 30:70 ratio (standard across the institution) between the minor and major assessment points:

  • 1 Written Examination (unseen): 30 marks
  • 1 Elective Assessment: 30 marks – the student chooses ONE of the following options:
    • Written Coursework
    • Oral Presentation
    • Musical Performance accompanied by Written Documentation
  • 1 Project developed from the above Elective Assessment: 40 marks

The Examination provides a common component for all students, irrespective of the pathway they choose for the Elective Assessment. The other assessments have been specified with parity to existing module assessment patterns in mind. The benefits to students are that the initiative enables them to play to their strengths, and to influence both how they wish to be assessed and how they wish their marks to be apportioned. The Elective Assessment also permits an additional opportunity for interim feedback ahead of the final Project.
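
Purely as a sketch of the structure (the components, marks, and elective options are those listed above; the validation helper is hypothetical), the model can be written down as follows, making the preserved 30:70 split explicit.

    # Illustrative sketch of the elective assessment model listed above
    # (components and marks from the post; the validation helper is hypothetical).
    ELECTIVE_OPTIONS = [
        "Written Coursework",
        "Oral Presentation",
        "Musical Performance accompanied by Written Documentation",
    ]

    model = {
        "Written Examination (unseen)": 30,                    # common to all students
        "Elective Assessment": 30,                             # ONE option chosen from above
        "Project developed from the Elective Assessment": 40,
    }

    # Exam : (Elective + Project) preserves the institutional 30:70 ratio.
    assert model["Written Examination (unseen)"] == 30
    assert model["Elective Assessment"] + model["Project developed from the Elective Assessment"] == 70

    def validate_choice(option):
        """Check that a student's elective choice is one of the permitted pathways."""
        if option not in ELECTIVE_OPTIONS:
            raise ValueError(f"Unknown elective option: {option!r}")
        return option

    print(validate_choice("Oral Presentation"))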

My consultation with the students as to whether such an innovation would be welcomed was revealing: the graphical result (above right) of a poll conducted anonymously using EVS handsets (clickers) speaks for itself.

The focus group, which comprised 12 students in my class, was also consulted on several other major points of curriculum design, including the content and schedule of the lectures as well as the manner in which they will be taught and assessed and in which feedback will be delivered. They have decided upon all of the lecture topics themselves via a Doodle poll, and have been invited to write supplementary assessment criteria using a wiki; elements of self- and peer assessment will also be included in the module. Having discussed several different forms of feedback (written, dialogic, telephone, podcast, screencast) at the focus group, 33% of students said that they would prefer written reports, while fully 50% opted for dialogic feedback – an unexpected but welcome result.

(3) Student self-assessment of in-progress writing of a research dissertation

Earlier in the year, one of my senior postgraduate research students submitted a draft of a dissertation chapter to me in the knowledge that, while some sections were complete, others would need revision, either because she felt that they would benefit from further work or because she had yet to complete the research (largely ethnographic, for which she is entirely dependent on the availability of her study participants) that would enable her to finalize her writing.

Since I nonetheless wanted to give her feedback on her work in progress, I suggested to the student that, after a couple of weeks, she return to the draft chapter herself to reflect upon her writing, embedding comments electronically in Microsoft Word to identify the sections where she felt that further revision would be necessary and to explain why. I would then overlay my own feedback in a similar manner.

Reviewing draftwork that the student had herself annotated, I found my attention much more effectively directed towards the parts of the chapter upon which it was most fruitful to focus. I felt that I would have made many of the same comments as the student herself, and this means of reflection also enabled the student to ask further questions of her work, to which I was then able to respond, and allowed us to engage in a form of written dialogic feedback (see screenshot below).

The student likewise reported that she found it very useful to return to her chapter in retrospect, and particularly to document the areas she believed required additional work. This is a model of self-reflective feedback that I am now seeking to adopt for future research students.

Dissertation feedback sample

Dr Christopher Wiley
c.m.wiley@city.ac.uk
20.04.12

PREDICT Project

August 3, 2011

PREDICT, which stands for ‘Promoting Realistic Engaging Discussions in Curriculum Teams’, is a JISC-funded project within the Institutional Approaches to Curriculum Design Programme. The project’s focus is to develop a new curriculum design process that is efficient and flexible, focuses on enhancing educational development and the student experience, and is supported by responsive technology to accommodate our curriculum models. It is essential that the design process takes account of our diverse stakeholders – whether learners, staff or employers.

The project has been running for three years and we have provided information in a range of ways, but we would now like to use our blog, which has a link to this one. The main sources of information for the PREDICT project are:

PREDICT aims to develop a new curriculum design process that is efficient and flexible and utilizes responsive technology to accommodate our curriculum models and enhance learning opportunities.

The main objectives of the project are to:
  •   Engage all stakeholders in the process
  •   Develop a curriculum design process drawing upon stakeholder experiences
  •   Use technology to support the curriculum design process
  •   Develop values and principles for curriculum design around educational development and the student experience
  •   Complete the project with an evaluative and critical approach

We are really interested in case studies of curriculum design and review activity, so please do visit our blog this year; it will develop as information is added and will, we hope, become a place for you to share your practice.

Project Manager

Dr Pam Parker

Best practices in Moodle course design

Michelle Moore, Remote Learner @ Moodlemoot 2011

Michelle provided some invaluable tips on setting up your Moodle module.
Strongly agree with points 3 and 10!

 Top 10 tips!

  1. Don’t use more than 3 font styles. It increases cognitive load for your learners. Students spend to much time processing info.
  2. Maintain consistency.
  3. Don’t use course page for your content, use it as a launchpad..
  4. Make sure you can see one complete topic on the screen and no more.
  5. Don’t be the one doing all the work, create a question creator role, let students create the quiz questions. Let the students collaborate and participate.
  6. Remember the value of logs, a link from a label cannot be shown in reports. Use labels to guide students!
  7. Don’t force constant scrolling
  8. You can wrap resources around images in sections, they don’t have to be huge and take over the page. Consider the different screen sizes students will be looking at.
  9. Build content in books and lessons.
  10. Try out new tools, vary it.