This week the DfE’s Similar Schools Comparison Report came out: an AI‑powered attempt to match schools with supposedly comparable peers and then benchmark attendance, persistent absence, year‑group performance… otherwise known as the usual dashboard of Things That Will Improve If We Look At Enough Charts. It made me really despair. At first I was excited: I could finally find those schools with similar challenges to ours, and enlightenment would follow! Very quickly I just found myself looking at their websites shouting, “They are nothing like us at all!?!”
Now, in principle, comparing “like with like” is a perfectly sensible idea. Surely no one would expect a tiny rural infant school to have the same profile as a sprawling coastal primary with deep deprivation and a transient population… Oh? Well, nobody except Ofsted – but more on that later… The devil, as always, is in the definition of “similar”, and that’s where this whole exercise starts to wobble.
Context is Key…
Our school has a 55‑place Specialist Unit, embedded fully within our mainstream structure. That’s fifty‑five children with complex SEND needs, supported by specialist staff, working across a highly adaptive curriculum and environment. It shapes everything: staffing, attendance patterns, curriculum design, safeguarding, daily operational rhythms, you name it.
So imagine my surprise when I checked the “similar schools” list and discovered that:
Not one of the comparison schools has anything close to our Specialist Unit. The largest equivalent provision in the matched group? 12 places.
Twelve???
We have nearly five times that number.
This isn’t “similar.” This is “shares a vague feature in the database.”
And this matters, because attendance for many of the children in our unit is influenced by factors completely outside the usual models: medical appointments, therapies, highly specific needs, complex health profiles. You cannot meaningfully compare these patterns to a school without such a large cohort.
It’s like comparing a family car to a minibus and concluding that the minibus is “underperforming” because it uses more fuel per mile. It makes no sense.
Here’s the Twist… We Still Come Out Looking Rather Good
Despite the structural mismatch, the report places us 7th out of 21 schools for attendance and 9th out of 21 for persistent absence; above the median in both cases.
Even in a comparison model that doesn’t account for something as glaring as a 55‑place unit… we still hold up. But of course, the report then says: your SEND attendance, at 92.5%, is in the lower quartile. Well, firstly, Special School attendance, according to the DfE’s own data this week, currently stands at 86.65%. Our mainstream SEND attendance is actually 94.5% according to our MIS, so we can see the impact our Specialist Unit has (and trust me, right now in the Accountability saloon that 2% difference really matters). If we are to develop more inclusive schools, then the validity of this data matters. When schools are under scrutiny, Ofsted will use this report as their first point of reference, and when it is wrong, the inspection becomes a backs‑against‑the‑wall experience for many schools.
Therefore, the report just feels unfair and misrepresents our whole school community.
The Bigger Problem: Context-Free Accountability
To widen the lens, take the graphic I’ve added below (FSM6 vs Ofsted outcomes). It shows something the sector has known, and ignored, for years:

The higher the proportion of pupils eligible for Free School Meals (FSM6), the worse the Ofsted outcomes tend to be. This has been going on for decades.
And not by a whisker, but by an absolute canyon.
Look at the far ends:
- “Very High FSM6” schools have the lowest proportion of “strong” or “exceptional” outcomes and the highest proportion receiving “needs attention” or “urgent attention”.
- “Very Low FSM6” schools enjoy a rainbow of greens and blues — “expected,” “strong,” “exceptional.”
It doesn’t take a statistician to spot the pattern, because it isn’t a pattern; it’s a structural relationship that asks massive questions about societal challenges far beyond schools.
Yet here we are, again, watching a national system push out another shiny comparative model that glosses over complexity, ignores context, and pretends that differences in need, deprivation, SEND, health profiles and local challenges can be smoothed away by saying… “We are going to be TOUGH on this!”
Why This Matters
Schools don’t operate in controlled laboratory conditions. They are living ecosystems; shaped by children, families, health, politics, geography, trauma, opportunity, local services, multi‑agency capacity, transport, housing, and about nine thousand other things not captured in a dataset.
When comparisons ignore these things, the following seems to happen, and no one really questions it:
- high‑need schools get unfairly labelled,
- low‑need schools get praised,
- and the profession gets fed another narrative of “If they can do it, why can’t you?”,
even when the “they” and the “you” are categorically incomparable.
We deserve better than that. This week Ofsted came out with the line, “We will never succumb to the quiet curse of low expectations...” As if schools that underperform are all embroiled in Witchcraft.
More than Numbers
None of this diminishes the work our staff and families put in every day. In fact, it highlights it. That we perform strongly even within a mismatched comparison suggests that our systems, relationships, and culture are doing a lot of heavy lifting.
But – it is NOT fair or equitable.
Successes should be recognised through the lens of comparable judgements (properly researched and evidenced comparisons!). I am all for seeing how my school compares alongside other schools with 42% SEND in Y6. And I would genuinely be up for visiting those that do better. 100%. Reading a report that has inaccurate data and compares a school to other schools that are ‘nothing’ like it – that’s shoddy work at best.
A Final Thought
If the DfE genuinely wants to support schools, the first step is simple:
Acknowledge the complexity. Stop sanding off the edges to make the data fit. These limited understandings do not help at all. In fact, I would say – open the can of worms and let us see the full complexity even if that creates other problems – which I am sure it will.
Because schools with extensive specialist provision, high‑need cohorts, deep community responsibilities, and rural or coastal deprivation are not outliers. We are part of the landscape of modern education in this country. And any model that can’t see us clearly cannot hope to help us effectively.