
Peer-to-peer learning and support: Two-stage assessments

The Practice

The EPSRC Centre for Doctoral Training (CDT) in Aerosol Science is a seven-institution CDT led by the University of Bristol, spanning the Universities of Bath, Cambridge, Hertfordshire, Leeds, and Manchester, and Imperial College London. The CDT offers a cohort-based, four-year training and research PhD programme, during which all postgraduate researchers (PGRs) undertake a taught first year based together at Bristol before moving to PhD research at their “home” institutions. Since 2019 we have recruited 96 PGRs to the programme across five cohorts; ~30% of these are international students (in line with EPSRC funding rules), and ~25% are returners to education.

Aerosol Science is highly multidisciplinary, and our PGRs have studied a broad range of undergraduate disciplines spanning the physical sciences, engineering, life and medical sciences, earth and environmental science, and pharmaceutical science (Figure 1). This presents unique challenges in delivering a training programme, as PGRs arrive with a wide variety of prior disciplinary knowledge and experience. We therefore seek to use teaching and assessment methods that make this diversity an advantage by maximising structured opportunities for peer-to-peer learning and support. An example of this is our use of ‘two-stage assessments’[1] in the units Core Aerosol Science I and Core Aerosol Science II, both compulsory 30-credit-point units taught via Team-Based Learning (TBL)[2], each covering eight topics.

Figure 1. Undergraduate disciplines of cohorts 1–4 PGRs (n=73).

In a two-stage assessment, learners undertake an individual assessment, followed immediately by a collaborative or ‘team’ assessment using the same questions. The method is an example of assessment of, as, and for learning: it both assesses individual learner knowledge and provides an opportunity to increase that knowledge via the team assessment component. Two-stage assessments have been shown to improve learning compared with individual testing alone[3] and to foster community building within groups[4]. This approach makes an asset of the diversity of our PGRs in the classroom, as well as aligning with the cohort-based nature of the training programme. It has the additional advantage of reducing testing anxiety, helping to support student wellbeing[1].

After each of the 16 Core Aerosol Science unit topics, our PGRs complete an individual, online multiple-choice assessment consisting of 20 questions, for which they are given one hour. Immediately afterwards they sit the same assessment online in multidisciplinary teams, this time with a time limit of 40 minutes. A PGR’s mark in the individual assessment is withheld until the team assessment is completed. Teams comprise 4–6 PGRs, are allocated by an instructor to ensure each contains a diversity of expertise, and are fixed for the year. In total, PGRs work in their teams for over 200 hours during their initial training year, including both the in-class TBL training and the two-stage assessments.

In the team assessment, each team must discuss and agree an answer to each multiple-choice question. They submit this answer using online software and receive immediate feedback: a green tick for a correct answer or a red cross for an incorrect one. When a team selects an incorrect answer, they must continue discussing and selecting from the remaining options until they identify the correct one. In this way they receive ongoing, immediate feedback on the quality of their understanding.
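For illustration, the answer-until-correct loop for a single four-option question can be sketched as follows. The function and the `submit_answer` callback are hypothetical stand-ins for the team’s discussion and the assessment platform, not part of any real software’s API.

```python
# A minimal sketch of the answer-until-correct loop for one team question.
# Assumes four answer options; `submit_answer` is a hypothetical stand-in
# for the team's discussion and submission step.

def run_team_question(correct: str, submit_answer) -> int:
    """Return how many attempts the team needed to find the correct answer."""
    remaining = ["A", "B", "C", "D"]
    attempts = 0
    while remaining:
        attempts += 1
        choice = submit_answer(remaining)  # team agrees on one remaining option
        if choice == correct:
            return attempts                # green tick: move to the next question
        remaining.remove(choice)           # red cross: option eliminated, discuss again
    return attempts
```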

In terms of scoring, PGRs receive 5 points for each correctly answered question in the individual assessment. In the team assessment (where each question has four answer options), the team scores 5 points for each answer correctly identified on the first attempt, 3 points for each answer identified on the second attempt, and 1 point for each answer identified on the third attempt. No points are awarded for answers identified once all other answer options have been eliminated. A PGR’s summative mark for each Core Aerosol Science topic is calculated using a weighting of 85% for their individual mark and 15% for their team mark. The team mark can only improve a learner’s score, and so if a PGR were to score more highly in the individual assessment than their team does in the team assessment, their summative mark would be equal to their individual mark.
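The scoring rules translate directly into a short sketch. The point values and the 85%/15% weighting are taken from the description above; the function names are illustrative, not the CDT’s actual marking software.

```python
# Team points by attempt number: 5 / 3 / 1, and 0 on the fourth attempt
# (once all other answer options have been eliminated).
TEAM_POINTS = {1: 5, 2: 3, 3: 1}

def team_question_points(attempts: int) -> int:
    """Points a team earns for one question, given the attempts needed."""
    return TEAM_POINTS.get(attempts, 0)

def summative_mark(individual_pct: float, team_pct: float) -> float:
    """85% individual + 15% team, floored at the individual mark alone."""
    return max(0.85 * individual_pct + 0.15 * team_pct, individual_pct)
```

For example, a PGR scoring 80% individually whose team scores 92% receives 0.85 × 80 + 0.15 × 92 = 81.8%; if the team instead scored 70%, the summative mark would remain at the individual 80%.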

Successfully implementing two-stage assessments involves practical considerations, securing learner buy-in for the method, and providing appropriate support for the teams. To address the practical considerations, we use the bespoke online TBL software InteDashboard[a] to deliver our two-stage assessments. We have previously used a combination of Blackboard (for the individual multiple-choice assessment) and Immediate Feedback Assessment Technique (IF-AT) scratch cards (for the team assessment)[b], which also works well.

We seek to secure learner buy-in for the method during our induction activities at the start of the academic year. We explain the way in which our graduate competencies, teaching approaches and assessment strategies have been aligned, and the research evidence we have drawn on in designing their PhD training environment. We also provide opportunities for the PGRs to raise concerns and allow time to respond to these.

We employ a range of strategies to support the teams to function effectively. We foster the development of team identity by tasking our PGR teams with creating a team name and coat of arms, as well as agreeing ground rules for how their team will operate. Our PGRs are introduced to Tuckman’s stages of team formation[5] and communication frameworks such as Nonviolent Communication[6]. Our PGRs also complete two-stage assessment cycles on topics unrelated to aerosol science so that they can experience the method in a low-stakes environment. An opportunity for PGRs to participate in formative team feedback and self-reflection takes place after six assessment cycles.

The Impact

The impact of two-stage assessments was measured through analysis of individual and team assessment scores, and anonymous feedback surveys of our PGRs. Both indicated a positive picture, with the two-stage assessments providing an opportunity to improve PGR knowledge, facilitate peer-to-peer learning, build a sense of community, and support student wellbeing.

Individual and team assessment scores

We examined individual and team assessment scores for PGRs in the first four cohorts of the CDT, totalling 73 PGRs who between them sat 1157 individual assessments and 240 team assessments. The marks in all 240 team assessments were compared with the mark achieved by the highest scoring member in each team for the corresponding individual assessment. On 167 occasions (70% of assessment cycles) a team scored more highly than its highest scoring individual member, with an average increase in the team mark of 8% from the highest scoring individual mark (Figure 2). In 92% of the 240 two-stage assessment cycles teams did as well as, or better than, their highest scoring individual member.
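As an illustration of how such a comparison can be computed, here is a hedged sketch assuming per-cycle records of each team’s mark and its members’ individual marks; the data structure and field names are ours, not the CDT’s actual analysis code.

```python
# Sketch of the team-vs-best-individual comparison, assuming each record
# holds one assessment cycle's team mark (%) and the individual marks (%)
# of that team's members. Names are illustrative.

from dataclasses import dataclass

@dataclass
class Cycle:
    team_mark: float
    individual_marks: list[float]

def summarise(cycles: list[Cycle]) -> None:
    best = [max(c.individual_marks) for c in cycles]
    gains = [c.team_mark - b for c, b in zip(cycles, best) if c.team_mark > b]
    matched = sum(c.team_mark >= b for c, b in zip(cycles, best))
    print(f"Team beat its best individual in {len(gains)}/{len(cycles)} cycles")
    print(f"Average gain when ahead: {sum(gains) / len(gains):.1f} percentage points")
    print(f"Team matched or beat best individual in {100 * matched / len(cycles):.0f}% of cycles")
```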

Figure 2. Relationship between the team assessment mark and the mark of the highest scoring individual team member for the 240 two-stage assessments completed by CDT PGRs.

The data demonstrate the impact that collaborative assessment can have on the knowledge of the team as a whole; the highest scoring team members are not simply providing answers to those who scored less well without further discussion, since in that case the team score could not exceed the highest individual score. Because each of the 20 questions is worth 5% of the total mark, an average increase of 8% in the team mark over the highest individual mark corresponds to one to two additional questions answered correctly (5 to 10%).

PGR feedback surveys

An anonymous feedback survey exploring experiences of the two-stage assessments was completed by 27 PGRs spanning three cohorts. The survey asked participants to describe their experience of the two-stage assessments and to explain how their team usually arrived at its chosen answers during the team assessment. The questions aimed to gauge how the two-stage assessments were received by the PGRs, and which strategies teams used during their discussions.

PGR comments on their experience of the two-stage assessments were sorted into five categories depending on their content: exclusively positive; positive and negative aspects; neutral; exclusively negative; or not relevant/meaning unclear. Figure 3 shows the distribution of the responses: 64% of comments were exclusively positive, while 77% included at least some positive content. Where PGRs did report negative aspects, these related to the number of multiple-choice tests they had to complete (recall that each test is sat twice) and to difficulties in the team assessment when some group members were absent – this was particularly challenging while Covid-19 restrictions were operating.

Figure 3. Classification of learner experience of two-stage assessments based on feedback content.

PGR comments suggested that our aims when implementing two-stage assessments were being met. For example, we aimed to improve student knowledge (‘Extremely useful for working through questions that weren’t understood’), to facilitate peer-to-peer learning (‘A good way to reflect on what I have learned and to consolidate the knowledge with the team members’), to build community (‘It lets me feel at home! It’s a very very nice experience...’), and to support student wellbeing (‘I think it takes away a lot of the stress of usual examination techniques and is more enjoyable’).

PGRs reported using approaches likely to be effective for learning during the team component of the two-stage assessments, such as team discussion, combining voting processes with discussion, and deferring to expertise. 

Next Steps

Based on the positive impact of two-stage assessments on PGR learning and the positive feedback from our PGR cohorts, we intend to continue using two-stage assessments in our course delivery. To address the issue regarding the number of assessments reported by a minority of PGRs, we plan to reduce the overall number of assessments by 12.5% from academic year 2024-25. 

We believe that this is the first time that two-stage assessments have been used at the doctoral level. 

Contact

If you would like to know more about this case study, please contact Dr. Rachael Miles, Course Director for the CDT in Aerosol Science (Rachael.Miles@bristol.ac.uk) or Dr. Kerry Knox, Science Education Specialist with the CDT (Kerry.Knox@york.ac.uk). Details of further educational research being conducted within the CDT are available on our website.

References

[1] Zipp, J. F. (2007). Learning by exams: The impact of two-stage cooperative tests. Teaching Sociology, 35, 62-76. 

[2] Michaelsen, L. K., Knight, A. B., & Fink, L. D. (2004). Team-Based Learning: A Transformative Use of Small Groups in College Teaching. Sterling, VA, USA: Stylus Publishing.

[3] Gilley, B. H., & Clarkston, B. (2014). Collaborative Testing: Evidence of Learning in a Controlled In-Class Study of Undergraduate Students. Journal of College Science Teaching, 43(3), 83-91. http://www.jstor.org/stable/43632038.

[4] Sandahl, S. S. (2010). Collaborative testing as a learning strategy in nursing education. Nursing Education Perspectives, 31(3), 142-147.

[5] Tuckman, B. W. (1965). Developmental Sequence in Small Groups. Psychological Bulletin, 63(6), 384-399.

[6] Rosenberg, M. B. (2015). Nonviolent Communication: A Language of Life: Life-Changing Tools for Healthy Relationships. PuddleDancer Press.


[a] InteDashboard – All-in-one Team-based Learning Platform

[b] Immediate Feedback Assessment Technique (IF-AT) Forms (cognalearn.com)
