Presenting a novel assessment and feedback portfolio for students in the School of Biological Sciences – Dr Bex Pike (Lecturer in Biological Sciences)
Lack of student engagement with feedback is well documented, alongside consistently low NSS scores for assessment and feedback across the HE sector; this is concerning given that assessment and feedback both have a powerful influence on students' education (Gibbs & Simpson, 2004). The Assessment & Feedback Portfolio (AFP) draws together recommendations for best practice in enhancing student engagement with feedback: improving assessment and feedback literacy, making key information such as deadlines and skills development more accessible, and supporting students in reflecting and acting on feedback. The AFP and related initiatives are grounded in published research, from work on assessment and feedback literacy and Boud's double duty of assessment to Naomi Winstone's DEFT toolkit and FEATS project. The AFP is a collation of resources including interactive assessment landscapes, skills framework mapping and a feedback engagement tool. It was co-created by staff and undergraduate student partners, drawing on research conducted within the School as well as the published literature. To increase visibility and student uptake of these tools in the 2021-22 academic year, the AFP was integrated into our curricula through tutorials, feedback workshops and regular feedback cafes. In this presentation, we will explain the format of the AFP, our evaluation of its use by students, and how it can be adopted in other Schools across the University to support inclusion and accessibility for students in assessment & feedback.
Audio feedback – A student and instructor perspective – Dr Anthi Chondrogianni (Lecturer in Economics)
According to the National Student Survey in UK universities (NSS, 2021), students regularly report being less satisfied with assessment and feedback than with other elements of their studies, such as the teaching they receive, organisation and management, or other academic support. The solutions applied in theory and practice range from the narrowly administrative (e.g. classifying almost all communication as feedback) to the broadly psychological (e.g. the effect of feedback on wellbeing). In my research paper, I focus on audio feedback: how providing recordings with comments and answers can improve student experience and learning, intimacy and clarity, and how it can offer a more personalised channel of communication between instructors and learners. I examine the technological needs and infrastructure, the type of framework, and student response to audio feedback. My results suggest that audio feedback is perceived as more compassionate. The instructor's tone and emphasis on certain words make it more beneficial for students and can lead to increased student engagement. Students report high satisfaction with the amount and quality of information provided, both in level of detail and in guidance for improvement. Audio feedback was found to be helpful in preparing for future assignments and effective in helping students understand the strengths and weaknesses in their coursework. Compared with written feedback, about two out of five students found audio feedback more personal and one out of five found it more useful. In recent years, technological improvements have made it easier for instructors to provide personalised audio feedback. Most teaching platforms, such as Blackboard, contain built-in audio players, making it significantly more time-efficient for instructors to provide audio feedback. Additionally, there are significantly fewer issues related to audio quality, compatibility and accessibility than in the past.
The simplification of the recording process makes it easier for instructors with limited technological experience to experiment with the format. To conclude, by using this alternative feedback format in my practice, I demonstrated that even when feedback is not materially different in content, students perceive it as more tailored to them, providing greater personal support and creating a more effective learning environment.
Does the use of rubrics in marking online assessment help reduce the marking load? Initial findings from a randomized controlled trial – Dr Rabeya Khatoon (Senior Lecturer in Economics)
The online shift of assessments during the pandemic saw some increased use of grading rubrics. Though there are claims that rubrics can improve marking consistency and reduce the marking load for the marker (e.g., Blackboard Help), to the best of our knowledge this effect has not been analyzed formally. The benefits of using rubrics may vary with the type of assessment (summative or formative, exams or coursework) and the type of rubric used. We might also see variation between small and large units, and between assessments marked by the unit director and those marked by other colleagues. To understand the effect, we designed a randomized controlled trial in which the same marker marks one group of randomly selected submissions with a rubric and another group without, and we keep records of the hours spent marking. We plan to present preliminary findings from the analysis, based on Khatoon, R., Selwyn, B., Jonson, A., Rahman, M., and Samkharadze, L. (work in progress, 2022).
Reimagining the MSc Project – Professor Ian Craddock (Department of Electrical & Electronic Engineering)
Our MSc in Digital Health takes an innovative approach to the MSc Summer Project. As part of a multi-disciplinary group, students undertake an accelerated process of digital technology development, evaluation, and implementation. This provides them with experience of the key phases of the product development lifecycle: market research (phase 1), patient and public involvement (phase 2), evaluation design (phase 3), quantitative data analysis (phase 4), regulatory submission (phase 5), and post-market surveillance (phase 6). At each phase, students receive a brief containing a description of a scenario derived from real technology case studies obtained from industry and the NHS. The briefing includes a description of an open-ended task requiring creativity and research from the group.
We have developed novel processes for assessment, which involve: (1) a written group report for each phase; (2) individual contribution as assessed by the student and their peers; (3) a structured engagement assessment undertaken by teaching assistants (TAs); and (4) an individual student presentation.
In this presentation, we will describe our reimagined Summer Project and present our research evaluating it.