Giving young people a voice in teaching and learning through student surveys
The Great Teaching Toolkit (GTT) is more than just online courses – the Toolkit also contains student surveys, which help teachers identify a specific aspect of practice that they can productively focus on as part of their professional development, all linked to the Elements of the Model for Great Teaching.
We began developing the secondary school-age version of our student surveys shortly after the GTT: Evidence Review was published, working with schools in New Zealand and the UK—Prof Rob Coe has already spoken about this in some detail here.
After gathering enough data to validate question items from each Dimension of the Model for Great Teaching, and reject others, we turned our sights to adapting these question items for younger pupils—it was important for us to develop a format for the student surveys which would be accessible to a younger audience.
This phase of student survey development allowed us to work closely with school leaders and classroom practitioners in the North East of England, where we are based. (If you are thinking, ‘This sounds like something for us!’, please get in touch!)
We anticipated that as we worked our way down to younger age groups, the difficulty of crafting appropriate student survey items would increase, and this has indeed turned out to be true. However, we were pleasantly surprised to learn that our ‘secondary’ level surveys were also suitable for older primary pupils.
Before the end of the 20/21 academic year, we conducted two focus groups with Year 5 and Year 6 pupils at St Aloysius Catholic Primary School and St Joseph’s Catholic Primary School, both in South Tyneside. We were lucky enough to speak to some articulate and thoughtful young people, who generously agreed to miss a maths test to help us. (Don’t worry, we were assured they’d catch up on the test later!) We asked them to describe examples which might illustrate each question item; we also asked them to define some of the trickier words and to suggest alternatives where we thought a change could improve accessibility.
It was interesting to hear their thoughts about what ‘different’ means in the Dimension 2 question item: ‘My teacher understands and supports pupils who are different’. Contrary to our expectations, the pupils were able to give several examples of ways in which a peer may be different – suggesting academic ability, how much help an individual needs to be able to learn, home background, appearance, culture and more. In short, they considered exactly the same things we were trying to get at with that word!
These focus group sessions also made it clear that primary school pupils were more concerned than their secondary counterparts about their teacher’s feelings and about giving the ‘right’ answer to the questions. It was eye-opening to hear them readily admit that their teacher’s presence in the room would affect the answers they gave; because we know it won’t always be possible for another member of staff to administer the surveys, we’ve incorporated more reassurance on this point into our survey introduction.
‘We’re listening and learning a lot’
Soon after these focus groups, we had our first chance to observe the student surveys being taken by pupils, this time at St Godric’s Catholic Primary School in County Durham.
We knew we wanted to embed a video recording of the question items into our primary version of the student survey, but we didn’t want to base decisions around the timing and pace of reading on a hunch, so we tried it out with another group of enthusiastic Year 5s and 6s. One of us read the items aloud with a stopwatch at the ready; later, we were able to calculate an average time per item. This visit was certainly a milestone for those of us at EBE working most intensively on the student survey development.
More recently, we’ve returned to St Aloysius—to the Infant School this time—where we’ve had the good fortune to speak with some incredibly well-mannered and curious Year 1 and Year 2 pupils and trial a further simplified and shortened version of the survey. Instead of offering five options ranging from ‘Agree Strongly’ to ‘Disagree Strongly’, we narrowed it to three—and added illustrative emojis for further clarity. Research suggests that limiting the options in a survey for younger children may produce more reliable results (Bell, 2007).
Any primary school teachers reading this may be unsurprised to hear that there was a lot of peeking at neighbours’ work, but thankfully we don’t think this poses too much of an issue for the validity of the survey results.
The truth is that we still don’t know for certain the extent to which it will be possible to create valid measures of the important elements of teaching quality for Early Years settings. The best chance we have of being able to achieve this is by working with teachers who stand in front of four and five year olds every single day. Student survey development is still a work in progress, but we’re listening and learning a lot along the way.
Bell, A. (2007). Designing and testing questionnaires for children. Journal of Research in Nursing, 12(5), 461–469. Available at: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.992.7613&rep=rep1&type=pdf