We’ve just come back from an intensive programme we designed, for Aga Khan Academies in Mombasa and Hyderabad, to help them revamp and develop their use of the Centre for Evaluation and Monitoring’s assessment data. After three days on each campus, here is the rationale behind it, our approach, and a bit about the experience.
The brief
Jonathon Marsh, Global Manager for Professional Development at Aga Khan Academies, contacted us a while back to ask for our help with the challenge of developing a group-wide approach to effective use of CEM’s assessment data.
In both Mombasa and Hyderabad, the schools had been using the data for four years, and Stuart Kime, our Director of Education, had gone in at that point to conduct some training sessions with key staff members. Since that training, the leadership team had not seen a coherent plan implemented for the data to be rolled out, shared, and used consistently well. That, then, was the challenge to us this time round: it had to be more than just a training plan in this case.
As well as a honed three-day plan on each campus, including some vital sessions on data interpretation, it was also a key objective that all groups (Primary, Secondary and IB-level) in both academies should come up with a strategy for use of the data from this point on. We aimed to guide this process so that, by the end of our visit, there would be ideas and concrete plans in place for a self-supporting system, both school- and network-wide.
The approach
So, with the help and guidance of Jonathon and Alex Holland (Curriculum Manager at the Academies), we divided our approach into two distinct but linked groups:
- A core group of leaders, heads of department, and other key staff in each school;
- The whole body of teachers.
For the core group, the challenge was not only to learn the ins and outs of the assessment data on offer, but also to lead and support sustained change from within each school. The responsibility would fall on them to suggest and devise concrete action plans for their school, and they would do this through a guided macro-to-micro process; we created a template for this, looking in turn at baseline, predictive and then progress data, starting with the whole-school picture and drilling down into subject- and student-level feedback.
We talked about the creation of an annual cycle for data use, and Jonathon’s request was that each stage (Primary, Secondary and IB) should come up with concrete dates and actions, which he could then look through and coordinate into a whole-group action plan. On Day 2 of this training, we also covered the basics of validity and reliability in teacher assessments, and we talked about how to go about “mapping” CEM baseline data onto everyday, in-class assessments.
Secondly, we and the Aga Khan leadership team also recognised the importance of training the whole staff well in interpretation. The core group would be there to lead and support, but without a base knowledge of how to interpret the relevant pieces of data, it would mean nothing to the rest of the class teachers. These sessions were introduced by one of the core group, in order to link the three days’ hard thinking to real pragmatic changes in school, and we then provided a two-hour session on each of the assessments used. In these, we covered how the assessments work and what the data are, as well as what they mean and how to access them.
JD talking @CEMatDurham data with @AKANetwork Mombasa #TicklingTheBrain pic.twitter.com/T3qMe5jvfR
— Evidence Based Edu (@EvidenceInEdu) November 18, 2016
The experience
On both campuses, what we saw was hugely encouraging; we feel that the staff there have laid the foundations for really robust and positive change. That being said, we hope to see the plans implemented thoroughly in the coming years, and the success or failure of that will ultimately come down to the enthusiasm and commitment of the staff, and – of course – the necessary time being allowed for that work to be done.
The whole staff in both academies asked really intelligent questions and began to understand the power of the data. We could well have imagined – with most of them never having used the data before, and then being asked to stay after a long school day for two hours of data training – a pretty negative mindset, but they really bought in. Once they understood that the assessments are not curriculum-based, the power of the data at every level began to open up to them. Across all stages of the school, that, combined with good professional knowledge of the pupils in question, can be a really powerful driver for positive change.
The other key driver for that will be the core groups of staff on both campuses. In both places, what we saw was a fantastically open-minded, switched-on and welcoming group of teachers; they were wholly engaged, and the development of their confidence and understanding was very clear over the three days. In fact, in Mombasa, we were given Kenyan tribal names, and a new hashtag was coined, as they were particularly taken with the concept of their brains being tickled…
@EvidenceInEdu Asante! #DecisionDrivenDataDiscussions #TicklingTheBrain
— Evelyn Awino (@EbbaAwino) November 19, 2016
@EvidenceInEdu that was truly a wonderful and insightful training on the use of data and evidence to better our practices.
— Kepha obiri (@kepha_obiri) November 18, 2016
It will be interesting to see how their plans are seen through, and how they lead to really pragmatic school improvement strategies in the years to come.