Using student surveys to support professional development

Student surveys are one of the feedback tools available to teachers as part of the Great Teaching Toolkit. A teacher can use the student surveys to gain insights into their practice, providing indicators of what they are already great at and what could be an area of focus to be even greater.

Claire Badger, who is responsible for teaching and learning at Godolphin & Latymer School, spoke to us about how they are using the student surveys to support professional development. (Responses from student surveys are completely anonymous. In this case study, a teacher who used the student surveys has shared their results with Claire.)

 

Why did you choose to use the student surveys?

For some experienced teachers, professional development can feel like a waste of time – they’ve honed their classroom craft, built up their subject knowledge and have seen various teaching and learning fads come and go. Rather than seeing the yearly review as a chance to develop their practice, such teachers may view the professional development review as a top-down accountability measure. However, I have rarely met a teacher who doesn’t care about their students or their subject – they are still in the profession for a reason. And this is where the Great Teaching Toolkit and its associated student survey tools can be so powerful.

The Great Teaching Toolkit places the teacher, rather than the reviewer, at the heart of the process. They get to choose which students to survey, which aspects of their practice to survey, and whether to share the results with their reviewer (or not), giving the teacher a sense of agency over their own professional development. Unlike questions that you might construct yourself, the GTT survey tools are based on the GTT Evidence Review and have been fully tested to ask students about the areas which the evidence suggests have the most impact on teaching and learning. Plus, it is super easy to set up the questions and send a link to students to complete anonymously. From there, the resulting feedback report is clearly presented and benchmarked against responses from thousands of students.

 

What did you learn from using the student surveys?

An experienced teacher I worked with this year was keen to understand his students’ perspectives of his teaching, so we decided to use the survey questions from Dimension 4: Activating Hard Thinking. The responses highlighted that the students didn’t feel the feedback they got from the teacher helped them to improve and, as a result, we chose this area as a focus of the review.

 

How has your colleague’s practice changed as a result of this?

The teacher felt that the best way to tackle this was to incorporate more low-stakes testing, alongside feedback on student responses, into his lessons. The teacher chose to use a digital platform, Nearpod, which allows a teacher to set a variety of different question types and the anonymous sharing of student responses.

I watched a lesson where the teacher trialled this approach and could see how the students engaged with the teacher’s feedback on their answers. It also led to an interesting discussion around the pros and cons of using digital tools rather than pen and paper. The teacher is continuing to use more frequent, low-stakes testing using both digital and more traditional methods in his lessons, and we are planning on asking the students to complete the survey again in a few months’ time to see the impact on their perceptions. All in all, a really positive and uplifting experience for the both of us!

 

If you would like to speak to one of the team about how you might implement the Toolkit at your school or college, simply fill out this form, and one of the team will be in touch!

Comments
  • Hannah Abu-Ghaida

    I wonder how this approach could work in KS1 or EY – would pupil feedback carry the same weight or reliability?

    • Jack Deverson

      Hi Hannah! Good question. When we developed the surveys, we were able to validate them with sufficient reliability at a reading age of 5-7; that’s the lowest bracket we go down to. EY would be an interesting case to see if it’s possible, but isn’t something we have been able to crack yet. At the younger end of the scale, the questions are simplified and fewer in number, and the range of responses is limited. We do have several schools using them in primary, so do let us know if you’d like to know more. Will ask your question to our research team here too!

  • keith

    I like that the students have this input. They are there to experience how useful or not the teaching environment is. It will give them democratic participation skills, make them aware of what goes into a well-made school experience, and in future iterations when they are educators or politicians they can have a better view of places to intervene in the system. It becomes a slowly responsive feedback loop, but still faster than relying only on long-term grade performance data.

    I also hope that students will make the connection between their active engagement with hard thinking and what they are able to accomplish, and that this will motivate them to put in effort when they would otherwise have thrown in the towel.

    • Jack Deverson

      Thank you, Keith! That’s really lovely to hear. We hope the surveys offer a useful tool for both the teacher and the learner, as you suggest!
