The student surveys in the Great Teaching Toolkit can be used to identify an area of focus. David Jones, Assistant Head of Wallington County Grammar School, spoke to us about how they have been using student surveys to inform professional development choices.
Why did you decide to use the student surveys?
For years we have been interested in the power of student voice. We ran our own self-made surveys via Google Docs, but we found that, over the years, their efficacy really dropped off, because it was very easy to make excuses for results you didn’t really like. There was no option for any kind of external validation.
The fact that the Great Teaching Toolkit surveys are built on a model that pulls together the best research – rather than just someone’s view of what is important – provides a framework for best practice in terms of what constitutes great teaching. I think that gives it a lot of validity and makes it, in a nice way, harder to hide from the results.
Further to this, I think the feature which makes a huge difference is the aggregation of your survey scores against data from all teachers who have used the student surveys. From a personal perspective, you can look at your results and think ‘oh yeah, I’m great at that’, but then you see across the aggregated data that everyone is great at that – you should be great at that. Conversely, in another area where I might look at my results and think I’ve got really low scores and need to work on it, I can contextualise the data: it is low in comparison to my other scores, but across the whole GTT community I’m actually doing really well in that area. Ultimately, the aggregation allows you to target your work more accurately than just looking at the numbers without any context.
So accuracy and validity were two important factors in choosing the student surveys – people can buy into them because they can see how well researched they are.
How have you used the student surveys?
The starting point for us was making it really clear to staff that this is completely anonymous. We consciously divorced it from performance management. We did time it so that staff could use it to inform performance management, and we encouraged them to do so, but we made it very clear that the two were not linked. We made sure that staff knew it was completely in their hands, which is why we think staff buy-in was high.
Then, we wanted to give people something proactive that they could do – we suggested that colleagues identify, from the surveys, the area that they most wanted to work on. They didn’t have to choose, for example, 4.6 just because they were not great at it. They might actually choose to work on creating a climate of high expectations, not because it was their lowest score, but because they were surprised by that score and it was something they wanted to work on. Using the platform, we were then able to put colleagues into relevant groups, and we allocated a certain amount of CPD time for them to work on that area.
Staff then had the autonomy to choose how they worked on that area of focus: it could be the courses, conversations with colleagues, or the video observation and feedback tools. Everyone was able to log what they had done on the system, which is a really easy, nice way to monitor how well colleagues were engaging with their learning.
If you would like to speak to one of the team about how you might implement the Toolkit at your school or college, simply fill out this form, and one of the team will be in touch!