Assessment at Cottenham Primary School
[This blog builds on previous ones about Cottenham Primary School – in particular this one about their curriculum journey. We’re very grateful to James Kilsby for taking the time to follow up with an explanation of how they’re rethinking assessment at CPS.]
Assuming that you’ve managed the not-so-insignificant task of defining the knowledge that you are going to teach, and you’ve unpicked your way through the theory and operational challenges of intrinsic, extraneous and germane load – and everything else in between – your curriculum vision should now be a living, breathing entity. The bad news, however, is that things are now about to get really difficult, because how do you know that it is the curriculum that is working, and not some hitherto unidentified factor, or, worse, that it is actually the emperor’s new clothes…?
Meeting the challenge of being able to talk with authenticity and authority about the efficacy of the curriculum has been our strategic focus for the past year. It has also been, for me, the biggest challenge so far of designing and developing a knowledge rich curriculum.
However, like every other facet of curriculum design and delivery, it has required us to read, think and discuss. And again, like the other challenges, whilst it sometimes feels insurmountable and overwhelming, if you persevere, focus your research on people and groups who you suspect know what they are talking about, and stay true to your principles and fundamental values, then you will eventually make progress.
Rule 101 for us at CPS is that there are no ‘bolt-ons’ to our curriculum. It is every single planned interaction and experience over the course of a child’s seven years at the school.
To deliver on this, we have produced ‘The First Principles of The Curriculum at CPS’ document, which defines our approach to content, delivery and assessment.
The First Principle of Assessment at CPS states:
We will have clear definitions of what success for children within a domain looks like. We will ensure all teachers are well positioned to answer the question: have the children learned what we have taught them? To answer this, and to help us to understand the efficacy of the CPS curriculum, we will use a range of indicators, which will provide both the quantities and qualities that our curriculum yields.
Crucially, the teachers, who work in pairs alongside a senior leader in Curriculum Coordination Teams, have defined what success will look like within their subject for a pupil at the end of Year 6. We call these our ‘Intended Outcome Statements’ and they consist of around half a dozen bullet points for both the substantive and disciplinary knowledge that the Curriculum Coordinators have decided are the most important. There are no other details: no milestones or descriptors. It is all about knowing what we are aiming for, and then using our professional judgement to make accurate assessments about where individuals, groups, whole cohorts, and indeed, the whole school are on that trajectory.
This ‘range of indicators’ includes the information available at the point of delivery through pupils’ responses and the learning journeys evident within a child’s work. These reflect the school’s emphasis on developing the frequency, volume and sophistication of a child’s written work; enabling them to write with purpose and to be cognisant of the conventions of specific domains; and helping them become increasingly able to draw conclusions, make links and synthesise explanations. Consequently, direct comparisons of pieces of a child’s writing from across the year should provide a powerful source of evidence of the progress they have made. We also use the outcomes of any MCQs and other tests that the children may take during the course of a specific module. These seek to test the amount of substantive knowledge a child has retained by the end of the module. Outcomes may be used diagnostically, to help identify areas where children have successfully retained understanding and areas where they have struggled, and, where appropriate, to compare the performance of individual children and groups of pupils.
We also use the outcomes of any national statutory assessments, as well as in-school pupil performance information for reading (PIRA), mathematics (PUMA) and phonics (via the RWI assessments), and the teacher assessment of writing.
So far, so straightforward…
But we needed a methodology that enabled us to take this broad range of qualitative and quantitative information and shape it into an accurate and reliable narrative, whilst all the time making it a tool that teachers believe in, and one that does not set unreasonable expectations in order to prop up a dubious assessment system, risking fast-tracking them to burnout.
The missing link came from an unexpected source. When discussing the challenge with a group of governors, one of them asked if I was aware of ‘realist evaluation’, which I wasn’t. The resulting explanation, and an accompanying academic paper on the method, led to the Context-Mechanism-Outcome (CMO) model that we are now developing at CPS.
In this model, our Curriculum Coordinators posit a hypothesis about the performance of pupils within their domain. This is called the ‘Programme Theory’. They then define the class/year/group that they are going to test the theory against. This is the Context. The Mechanism is the teaching sequence, or the specific resource or tactic, that has been delivered. They then identify the evidence base (any of the suite of indicators above, as well as pupils’ views, teachers’ planning, external reports, etc.), which will be used to define the Outcome. The Outcome answers a range of questions, including: was the Programme Theory correct or not? What have you measured, or what findings have you made? And, finally, what recommendations do you have? All of this will be presented in a report no longer than one side of A4.
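For anyone who finds it easier to see the shape of a CMO laid out on the page, here is a minimal sketch of how a single record could be structured. It is purely illustrative – the field names and the example are my own assumptions, not a template we use at CPS – but it shows the elements each Curriculum Coordinator brings together on their side of A4.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CMORecord:
    """One Context-Mechanism-Outcome evaluation, small enough to fit on a side of A4.

    Field names are illustrative assumptions, not the CPS template.
    """
    programme_theory: str   # the hypothesis about pupil performance in the domain
    context: str            # the class/year/group the theory is tested against
    mechanism: str          # the teaching sequence, resource or tactic delivered
    evidence: List[str] = field(default_factory=list)   # indicators drawn on: MCQs, written work, pupil views, etc.
    theory_supported: Optional[bool] = None             # was the Programme Theory correct or not?
    findings: str = ""      # what was measured, or what findings were made
    recommendations: List[str] = field(default_factory=list)

# An entirely hypothetical example, just to show how the pieces fit together:
example = CMORecord(
    programme_theory="Regular retrieval practice closes gaps in pupils' prior knowledge in history.",
    context="Year 5, spring-term history module",
    mechanism="A five-minute retrieval quiz at the start of every lesson in the sequence",
    evidence=["end-of-module MCQ scores", "comparison of written work across the term", "pupil voice"],
)
```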
It is very important to say that this is still in its infancy, and we are currently working through the initial trials. Needless to say, it has required a lot of professional development, including a heavy emphasis on modelling and scaffolding what is required at all phases. We also recognise that there is unlikely to be any external help with this because, as far as I’m aware, no other schools currently use this model. We view this as a source of professional pride, and we believe the hard work will be worth it, as it offers a certain professional liberation from having to slavishly interpret banks of descriptors or analyse endless test results.
The deadline for the first tranche of CMOs is towards the end of this term, and we will then be in a position to check the quality and learn from any shortcomings ahead of the next iteration. Already, we are giving serious consideration to having a whole-school Programme Theory that will be interrogated across all domains (possibly focused on the role of retrieval practice in addressing the issue of gaps in pupils’ prior knowledge).
Hopefully the work will pay off, and we will have a reliable mechanism that not only provides evidence about the quality of our curriculum but, more importantly, is a powerful driver for continued school improvement at CPS.
The views expressed here do not necessarily reflect those of PTE or its employees.