Archive

Posts Tagged ‘research’

New Learning Spaces: Method of Evaluation

December 17, 2012
Observation in evaluating learning spaces

This is a continuation of ideas being developed around the evaluation of learning spaces at City University London.

City University London is currently engaged in the redevelopment of its estate, and there is a need to better understand the effectiveness of the new learning spaces being created as part of this work. A major part of the redevelopment is the re-conceptualization of the University’s learning spaces. The Learning Development Centre (LDC) is working closely with City’s Property and Facilities department to ensure that pedagogical principles are considered in the redesign of these spaces.

Understanding the effectiveness of new learning spaces is crucial for two reasons: to evaluate the effectiveness of a newly created space and to prepare better for the design and construction of future learning spaces. As such, this formal evaluation will be a ‘post-occupancy evaluation’ of the space. It is this stage of the evaluation cycle that presents the greatest challenges in aligning the evaluation method with the rationale and practical outcomes that drove the design intent. However, it is also crucial as the formative model for a full design and evaluation process, and as a source of data for new informal and collaborative spaces (Lee, 2009 in Radcliffe et al, 2009).

At a broad level, it is important for educational developers and education researchers to better understand how lecturers and students relate to the new built environment and what this means for the exchange of knowledge. To this purpose, it is understood that efforts to develop more effective learning spaces need to be informed by the extensive research into environmental behaviour and psychology (Jamieson, 2007).

To this end, I have found that the observation method is a popular tool in evaluating new learning spaces (Radcliffe et al, 2009). Observation builds on the principle that, for research into the use and effectiveness of new learning spaces, it is best to observe what actually happens in the natural setting (Denscombe, 2003) rather than to ask for thoughts retrospectively.

In line with the epistemology of participant observation, this study would enable the research team to take part in natural learning situations, enabling a better understanding of the learning processes involved in the new spaces. The observations will take place in the natural learning spaces because the research team is interested in the effects of the environment on learning as it happens, rather than under artificially created conditions. This allows the research team to record information and critical incidents as they occur (Creswell, 2009).

The observation method has a number of characteristics that other education research methods lack and that make it better suited to understanding the new learning spaces. These include:

  1. It directly records what users do in the space, as distinct from what they say they do or their perception of the room.
  2. Observation complements the other research methods being applied to understand the SLE and learning spaces: because it focuses on behaviour, it pairs well with methods that rely mainly on participants sharing their thoughts.
  3. When combined with contextual information, as it will be here, observation can give significant insight into the effects of the learning space on behaviour.

We are continually working on developing our research methods for evaluating the learning spaces. Please do share your thoughts and experiences of evaluating learning spaces.

Shadowing Research Approach in Evaluating Learning Spaces in Higher Education: A Methodological Discussion

November 16, 2012

Observation

Research is a major part of understanding the impact of new learning spaces at City University London. However, like all Higher Education research, methodological questions soon arise. I had a number of discussions with colleagues working on ‘Learning Spaces Evaluation’ about the best method of developing an understanding of the functionality and effectiveness of the new learning spaces at City University London. My current thinking is that an ideal method would be the ‘Shadowing Research Approach’.

Observation has its roots in anthropology and originated in the colonial encounter between Western people and colonized non-Western people, as Europeans tried to understand the origins of observable cultural diversity (Wikipedia, 2012). Later on, the observation method spread into psychological and sociological studies. So although the observation method started as something very intimate, stretching over many years, it was adapted to fit the demands of time and resource limitations in the West. This more ‘laboratory’ or ‘quasi-observation’ approach has a number of weaknesses, one of which is that it does not provide a holistic picture. Also, as pointed out to me by Kate Reader, observations can create unnatural behaviour in the learning spaces user. Furthermore, observations tend to focus on overt behaviour and do not consider internal thinking and intentions (Denscombe, 2003).

As such, I suggested we consider taking a ‘shadowing research approach’. In essence this approach combines quasi-observation (or peer observation, which is what we had planned) with case study work. It does not have to be as intimate as the original anthropological observation (i.e. we are not going to follow the academic home!), yet it will give us a richer and more holistic view of users’ behaviour and their patterns of using the technology.

So the chosen user, in this case a staff member, becomes the centre of the case study. We would employ a toolkit of research methods, such as in-class observation, out-of-class interviews (conversations), and a usability questionnaire, for a stated period of time (one month, for example). The approach would demand that we look at the staff member with a wider lens, meaning that we would seek to question and understand:


  • their interaction with technology generally (e.g. what technologies are present in the user’s life);
  • their teaching style (e.g. do they know about ‘blended learning’?);
  • their attitude and perceptions towards education technology;
  • their understanding of learning spaces.

At the end of the one-month period, we should have a better understanding of the relationship between the user and the learning space. And because we will have more information on their attitudes, perceptions, and general interaction with education technology across their academic teaching, we should get over the problem of having only a single snapshot of their use of the space.

The approach allows us, as researchers, to identify ourselves as active participants in the construction of the learning spaces experience for the user, and not to shy away from our explicit influence on the interpretation of the data. This permits us to interpret the relationship of the user with the learning space through our expert viewpoints.

In conclusion, there are a number of ideas to discuss further with City University’s Learning Spaces experts before implementing the ‘Shadowing Research Approach’.

Review: SACWG seminar, ‘The efficiency and effectiveness of assessment in challenging times’

On Thursday 24 November 2011, the Student Assessment and Classification Working Group (SACWG) hosted a one-day seminar, ‘The efficiency and effectiveness of assessment in challenging times’ at Woburn House, Tavistock Square, London. 

To open the seminar, Dr Marie Stowell (University of Worcester) set out the context for the day in her presentation ‘Efficiency and effectiveness in assessment’. She identified that one of the aims of SACWG is to explore variations in practice across the sector and how they impact differently on students, retention, and learning success, and she observed the importance of placing students at the centre of the process, given the fee structure proposed for 2012 entry coupled with the implications of assessment and feedback for student satisfaction. In light of the new funding model, one particularly pertinent observation she made concerned the cost of teaching in relation to the cost of assessment: the latter is resource-heavy, particularly once one factors in elements such as formative assessment (for which quality is less assured than for its summative counterpart), moderation, external examining, reassessment of failed components, and the possibility that students may be over-assessed in the first instance. She also suggested that assessment criteria may not warrant the detailed attention they are typically accorded, as students tend to take the more direct approach of endeavouring by less formal means to uncover exactly what it is that the lecturer is expecting them to produce. These arguments may indicate that both the efficiency and the effectiveness of assessment could usefully be enhanced.

The next talk, by Professor Alison Halstead (Aston University), explored how institutions have responded to the challenges of recent years, specifically the White Paper and its implications for students and for Higher Education. She noted that the potential increase in students’ financial burden will inevitably lead to heightened expectations concerning teaching quality, learning, and employability, in which respect assessment and feedback are currently among the most important issues. She warned that student challenges to the regulatory framework for assessment may be on the rise in the future and identified that it was imperative, in these changing times, to nurture outstanding, innovative teachers and for staff to support student learning and e-learning (including assessment). Calling for the abandonment of the rigid distinction often drawn between ‘teachers’ and ‘researchers’, she suggested that promotions should reward teaching excellence on a par with research. Later sections of her presentation outlined recent initiatives at Aston, for instance standardizing the use of the Virtual Learning Environment across the institution and introducing learning technologies such as lecture capture and electronic voting systems. Her view was that technology-enabled practice, while it took more time upfront to implement, was worth the investment in terms of teaching quality and learning success.

A structured group discussion and question-and-answer session with the morning’s speakers ensued. One point that emerged strongly was the importance of maintaining a variety of assessments, organized in a carefully considered schedule that takes a holistic overview at programme level. The latter becomes much more difficult in degree courses that incorporate elective modules, though there are both pedagogical and satisfaction-related reasons for offering choice to students and giving them ownership of their programme pathway. Another preoccupation amongst delegates was ensuring that assessments do not become too atomized, but relate to one another even beyond the confines of the module with which they are associated; one of the more innovative solutions proposed was the possibility of assessments straddling two or more modules. The need to develop sustainable structures was also discussed (for instance, moving towards group assessment to cope with rising student numbers), as was the importance of considering, as part of change management, what the benefits of effecting the change might be; if these cannot be persuasively articulated to staff and students, the change may not be worth implementing. A final warning concerned being too driven by regulations in designing efficient and effective curricula: it may be more useful in the long term to refer obstacles presented by the regulatory framework upwards so that they can be addressed.

The seminar resumed in the afternoon with a talk from Professor Chris Rust (Oxford Brookes University) on ‘Tensions in assessment practice’, which opened by reiterating the themes of the seminar in noting that current practices are neither efficient nor effective. He observed that students have a tendency to focus on the mark they will obtain from an assessment rather than on the educational content of their studies, and that their approach often becomes increasingly surface-level as they progress through their programme. He defended modes such as formative, self-, and peer assessment as potentially yielding more ‘authentic’ assessment, arguing that graduates should be able to evaluate themselves and their peers as an outcome of their programme, and that making greater use of these options might also free up staff resources for summative assessment. Noting that students do not warm to the notion of being assessed, he suggested that perhaps the word ‘assessment’ should not be used for formative tasks. He further observed that feedback practices might be made more efficient by strengthening the relationship between modules, such that students are encouraged to learn from feedback received in one module and to carry what they have learnt over to others. Lessening the sense of compartmentalization of individual modules would, in his view, lead to more inclusive, albeit less flexible, structures, in that standardization (for instance, imposing the same word limit for all assessments) does not always result in appropriate assessments.

Then followed a second group workshop session, on the theme of ‘What can institutions do to mitigate tensions?’. After a structured discussion of the issues, each group reported back to the seminar as to the problems that they had identified and the possibilities for efficient or effective solutions. It would be impossible to do justice here to the vast amount of ground covered between the several contributing groups. To cite just a few examples, key tensions that were raised included giving formative assessment a greater purpose (a proposed solution being to tie formative and summative assessments together in more meaningful ways), the problem of ensuring parity when using several examiners for the same assessment task (which may be solved by grading the assessment as pass/fail only), and the evergreen question of quality of feedback versus timeliness of feedback (for which there was some discussion about feedback becoming ‘quick and dirty’). On the question of standardization of process, I took the microphone to report back on the standardized feedback proforma that had been created in liaison with the students and implemented across one programme at City University London (see this post for details), and suggested, with much support from the floor, that students should be more involved in consultation regarding matters of assessment and feedback.

Prior to the close of the seminar a final speaker, Professor Paul Hyland (Bath Spa University), provided some reflections upon the day’s discussion. Noting that assessment was a large topic with which to deal, he categorized the day’s discussion as having crystallized around four main areas: external scrutiny (ranging from students’ parents to formal regulatory bodies); administration and management; the tutors’ perspective on assessment; and the students’ perspective. He argued that discussions of effectiveness and efficiency should always be mindful of the purpose of assessment. In his view, assessment should be concerned with measuring students’ performance and nurturing learning, whereas there exists a danger of (to put it crudely) simply setting assessments in order to get the students to do some work. In this context, a greater level of student involvement and engagement with assessment would therefore be beneficial. He also observed the need to use technology to improve existing practice, for instance, to supplement traditional modes of feedback with video and screencasts. Finally, he commented upon the importance of tutors having access to students’ feedback on previous assessments in order to understand where they are coming from and to be able to support them in their ongoing studies.

SACWG has kindly made available the presentation slideshows used by the speakers, and the comprehensive notes distilled from the two very productive group discussions (as reported back to the seminar by nominees from the groups), at the following link: http://web.anglia.ac.uk/anet/faculties/alss/sacwg.phtml.

Digital Researcher 2012

February 24, 2012


British Library 20.02.2012.

Having been placed on the waiting list for this event I was pleased to discover three days beforehand that I had managed to squeeze in as one of the 113 participants in the Digital Researcher 2012 workshop at the British Library. There were many more virtual participants via Twitter and Facebook.

Dr Tristram Hooley, the event director, set the context for the day by proposing that, whilst digital technologies in general, and social media in particular, are transforming academic life, the current evidence suggests that researchers are not using social media to its full potential due to a lack of training and development in the use of these tools. Yet there is a growing recognition that, given the social nature of the research process, recent developments in digital technologies have much to offer the research community.

The day consisted of four workshops (Identifying knowledge, Creating knowledge, Quality Assuring knowledge, and Dissemination of knowledge), of which participants were able to attend two. A flavour of the materials, tools, and topics covered can be gained from the presentations posted in advance of the sessions. I attended Identifying knowledge and Quality Assuring knowledge, which, though very different in style, were both stimulating and informative: the first about the tools which can help to meet the challenge of information overload, the second on the pros and cons of open publishing and the associated issues of intellectual property.

The highlight of the day for me was the opportunity to reconnect with Martin Weller, Professor of Educational Technology at the Open University, who gave the keynote address to close the event. Drawing on his recent book, The Digital Scholar, itself an exemplar of the move towards open access publishing, Martin outlined how social media is impacting on many aspects of academic life, including the challenge of teaching in the attention economy, how universities might move from a pedagogy of scarcity to a pedagogy of abundance, and how digital distribution of knowledge may produce new forms of public engagement with university research.

Attending the event has given me ideas for ‘Developing the Digital Researcher’ workshops at City University London, details of which will be announced soon.

If you would like to know more then please contact me.

Neal Sumner
n.sumner@city.ac.uk

New communities, spaces and places: inspiring futures for higher education

January 25, 2012

The Society for Research into Higher Education (SRHE) holds an annual conference for its members, with a dedicated preceding conference for members who are newer HE researchers. Having helped co-ordinate Newer Researcher events over 2011, along with colleagues Patrick Baughan (from City) and Saranne Weller (from King’s College London), this was the first year that I was co-convenor of the Newer Researchers conference.

The theme for the conference was New communities, spaces and places: inspiring futures for higher education – a theme which Patrick, Saranne and I spent an afternoon deciding on, thanks in part to a creative thinking technique called synectics (helpfully facilitated by former City colleague Uma Patel). We wanted a theme which would convey positivity in a time of increasing uncertainty, especially in the HE sector. We also wanted to widen our net and appeal to researchers engaged in Learning Technology-based research.

We received over 80 abstracts for paper presentations by newer researchers in the UK and across the world. After spending the summer months sending out abstracts for review and making final decisions, we accepted a total of 50. The conference was held over two days at the Celtic Manor in Newport, Wales, on 6–7 December 2011. We were happy to learn that most of the accepted authors were able to secure funding to attend the conference, despite tightening departmental budgets.


A view from the Celtic Manor resort

We were lucky enough to arrange not one but two keynote presentations from highly regarded academics – Dr Paul Ashwin from Lancaster University, and Professor Grainne Conole from the University of Leicester.

Paul’s talk was fascinating and gave researchers an honest and frank account of what HE research is really all about. Basically, it’s messy! As a relatively new HE researcher myself, I thought Paul’s talk helped to demystify the process of HE research and helped me to consider how to conceptualise it better. Paul’s talk was very well received by the other newer researchers; you can view his presentation slides here.

Grainne’s talk tackled a different subject and was equally fascinating. Grainne took us on a journey through learning technologies, exploring how they have evolved and how we can navigate our way through them in the future. I enjoyed Grainne’s take on e-pedagogies, and she also sparked lots of interesting discussion around the use of open resources and in particular open publication: should we publish our research exclusively online via blogs? Will doing so attract greater interest in our work? Or is traditional publication by submission to journals better? What are the implications of doing both? Hmmm… You can view Grainne’s presentation on SlideShare here.

So overall another really good conference by SRHE – looking forward to the next one!

National Centre for Social Research (NatCen)

June 10, 2011

The Analysis of Qualitative Data.

I attended this two-day workshop to extend and refine my skills in qualitative data analysis (QDA). Facilitated by three experienced NatCen researchers in the area of applied social research, the workshop drew on a wide range of recent studies to introduce the skills required to handle large qualitative data sets. The centre has developed its own methodology – Framework – for carrying out such analysis, and we learned how to apply this to a range of qualitative data.

Topics covered included how to develop a conceptual framework for analysis; thematic analysis; the development, use, and limitations of typologies; associations and explanations in qualitative data; and drawing inferences. Moving beyond the basic processes of coding data, the workshop was less about conversational or discourse analysis (which is less relevant to our applied research context) and more about finding pathways through data. It was attended by a range of PhD students and active social researchers (many from the Health sector, so doubly relevant to me!) and, as usual, not the least of the benefits of attending was the opportunity to share research experiences. The workshop was well organised and informative.
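For anyone unfamiliar with Framework, the core idea of a framework matrix (one row per case, one column per theme, each cell holding a charted summary that points back to the source data) can be pictured with a short sketch. The Python below is a toy illustration of that charting idea only, with invented class, theme, and participant names; it is not NatCen’s tool or part of any real software.

```python
# A toy sketch of a Framework-style matrix: one row per case (participant),
# one column per theme, each cell a charted summary plus a pointer back to
# the underlying data. Illustrative only; not NatCen's actual tool.
from collections import defaultdict

class FrameworkMatrix:
    def __init__(self, themes):
        self.themes = list(themes)
        self.rows = defaultdict(dict)  # case -> {theme: (summary, source_ref)}

    def chart(self, case, theme, summary, source_ref):
        """Record a summarised finding for one case under one theme."""
        if theme not in self.themes:
            raise ValueError(f"Unknown theme: {theme}")
        self.rows[case][theme] = (summary, source_ref)

    def theme_column(self, theme):
        """Read down a column: every case's summary for one theme."""
        return {case: cells[theme]
                for case, cells in self.rows.items() if theme in cells}

# Usage: chart two interviews, then compare cases within a theme.
matrix = FrameworkMatrix(["experience of services", "barriers", "suggestions"])
matrix.chart("Participant A", "barriers",
             "Cost and travel time cited as main obstacles", "transcript A, p.4")
matrix.chart("Participant B", "barriers",
             "Unaware the service existed", "transcript B, p.2")
print(matrix.theme_column("barriers"))
```

Reading down a column compares cases within a theme, while reading along a row keeps the whole case in view – which is one sense of the ‘pathways through data’ the workshop emphasised.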

One particularly useful discovery is that the next version of NVivo, a data analysis tool that has been reported on elsewhere on this blog, will incorporate the Framework tool as part of the software. It is a testament to the robustness and reputation of NatCen research that QSR has sought to develop NVivo to include the Framework approach to QDA in the next software update, anticipated in the autumn.

Whilst I wouldn’t claim to be an expert on QDA now, I am willing to share what I have learned with other team members who may be engaged in this kind of research, and I have materials from the workshop which you are welcome to look through.

Footnote – the venue at etc.venues was very good and close to Farringdon for away days. I brought a leaflet.

Neal

For information on the centre and their training events see

http://www.natcen.ac.uk/events-and-training/our training

For government guidance on producing quality in qualitative analysis see

http://www.civilservice.gov.uk/Assets/a_quality_framework_tcm6-7314.pdf

NVivo Training To Support Qualitative Research in Higher Education

In my hand I hold a ‘Certificate of Attendance’ for an ‘Induction to NVivo Training’. This is significant because I am now able to use computer-aided qualitative data analysis software to analyse a number of interview transcripts from a current education technology research project.

SIGNIFICANT DEVELOPMENT


The latest version of NVivo

Bryman (2009) points out that one of the most significant developments in qualitative research in the last 20 years is the appearance of computer software that can assist in qualitative data analysis. NVivo is one such tool: it attempts to support the analysis of word-rich text by removing many of the clerical tasks associated with the manual coding and retrieval of data.

QUALITATIVE VS. QUANTITATIVE

According to Bryman (2009) there is a distinction to be made between qualitative and quantitative research: qualitative research can be construed as a research strategy that usually emphasises words in the collection and analysis of data, whereas quantitative research can be seen as emphasising numbers. For Creswell (2008) this distinction rests on the basic philosophical assumptions researchers bring to a study, the types of research strategies used overall, and the specific methods employed in conducting those strategies.

Image: piles of papers (Flickr/stevenbley)

According to QSR International, the creators of NVivo, the software is important to qualitative research because it can help researchers easily organise and analyse unstructured, non-numerical information. The widespread mental image of a researcher dealing with qualitative data is of someone trying to navigate their way through piles of paper, sticky notes, and highlighters.

Well, NVivo provides a workspace to help you at every stage of your project, from organizing and classifying your material, through analysis, to sharing and reporting.

NVivo’s workspace is designed using Microsoft user interface guidelines, so it should look familiar and be easy to use. NVivo handles a wide range of data, including Word documents, PDFs, audio files, database tables, spreadsheets, videos, and pictures.

Once a document is imported, the researcher can work through the information and highlight key points, allowing for quick recall or analysis later. The data can then be organised and visualised with NVivo ‘nodes’ and ‘models’: like virtual filing boxes, they allow you to see all the information on a theme summarised together and in visual form.
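To make the ‘virtual filing box’ idea concrete, here is a minimal sketch of what coding and retrieval amount to, written in plain Python rather than anything NVivo actually exposes; the node name and document names are invented for illustration.

```python
# A toy model of node-based coding: tag passages of source documents with
# named nodes (themes), then pull back every passage filed under a node.
# Illustrative only; this is not NVivo's internals or API.

coded_segments = []  # each entry: (node, document, passage)

def code(node, document, passage):
    """File a passage from a document under a thematic node."""
    coded_segments.append((node, document, passage))

def retrieve(node):
    """Gather every passage coded at a node, across all documents."""
    return [(doc, text) for n, doc, text in coded_segments if n == node]

# Usage: code passages from two interviews, then review one theme together.
code("flexible furniture", "interview_01.docx",
     "The movable desks made group tasks much easier.")
code("flexible furniture", "interview_02.docx",
     "We wasted ten minutes rearranging tables every session.")

for doc, text in retrieve("flexible furniture"):
    print(f"{doc}: {text}")
```

The point, as with NVivo itself, is that the clerical work of filing and retrieving is automated; the analytical judgement about what counts as an instance of a theme remains the researcher’s.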

Qualitative research generally follows this process:

Qualitative research process

NVivo has the potential to support the researcher throughout this process, but I think it really comes into its own at the interpretation stage, where the software can be used to organize and manage the data, making the development of concepts and theory easier and supporting the write-up.

I think NVivo has the potential to support qualitative data analysis effectively and efficiently. However, the software is not easy to pick up, and reading or watching self-help material online is simply not enough; one must be given a basic guide by a knowledgeable user. Furthermore, qualitative research software like NVivo doesn’t do the thinking for you; it simply provides a sophisticated workspace that enables you to work through your information.

Access to NVivo at City University London is very limited. Essentially, researchers must purchase their own licenses, either personally or through their departments. Unlike SPSS, NVivo is not installed on the student build machines, so both undergraduate and postgraduate students must fund their own licenses. At the LDC, we have a number of licenses for NVivo 8, which we purchased a little while ago.

THE TRAINING

The training was delivered by Elizabeth Wiredu, founder of Data Solutions Services (DSS), who is a QSR-approved trainer and consultant. DSS provides tailored and dedicated training and consultancy services on NVivo and medical statistics. The training provided an opportunity to engage with the software under the guidance of an expert user. The benefit of training with Elizabeth was that she is a researcher herself, who uses NVivo to analyse qualitative data gathered in the medical field. Furthermore, it was positive that the training was designed to give delegates time to begin analysing our own research data.

SUPPORT

There is a sizable community of researchers using NVivo; for this reason QSR has created a forum to enable communication, where researchers can ask and answer questions.

Also, there are official online video tutorials should you need further information after the training.

THANKS

I would like to thank the Leadership and Staff Development Unit (L&SDU) at City University London for funding my place on this very useful training course.

Ajmal Sultany (Research Assistant)
