So, here’s some more evidence of bad judgement by OFSTED. I’m particularly interested in primary schools judged to be Requires Improvement by OFitsHEAD because I have direct recent experience of these schools being judged badly by people who clearly know no better.
Having looked at reports published in February 2014, I’ve taken the first five reports I could find with a section 5 ‘school inspection report.’ The following schools were inspected:
West Green Primary School, William Levick Primary School, Randal Cremer Primary School, Featherstone Wood Primary School and Christ Church, Church of England Junior School, Downend, with Ofsted URNs 125826, 112541, 100236, 131505 and 109165. The schools had one-day inspections on dates ranging from 16 January to 5 February this year.
The data is wrong and the judgements are unfair
All of the schools were judged to have achievement which Requires Improvement, based on data which compares the school to a national picture in an invalid and unfair way. And, naturally, they were all judged to have teaching which Requires Improvement by inspectors who had clearly made that decision before they entered each school.
These reports contain further evidence of the way Ofsted inspectors make up unsubstantiated claims to fit their prejudices, based on readings of data which is not even wrong:
At West Green Primary School, Crawley, “Achievement requires improvement. Attainment at the end of Year 6 in 2013 was broadly average, with mathematics being the strongest subject. From pupils’ starting points this reflects average progress over time. Pupils make good progress over Key Stage 2, but elsewhere it is patchy and, given their starting points pupils’ progress requires further improvement to be good.”
Attainment was broadly average compared to what, exactly? The national picture is distorted heavily by, amongst other things, parents acting in what they perceive to be the best interests of their children, with the more affluent pulling many, many strings to support their children’s progress both in and out of school. This makes comparisons between small cohorts of children and any national data inherently ridiculous and unfair. And the progress measures are based on a value added measure which is statistically invalid, utterly hopeless and tells you nothing whatsoever.
At William Levick Primary School, near Sheffield, “Although information in 2011 and 2012 suggests that children make good progress, in 2013 the proportion of children with a good level of development at the end of the Reception was well below the national average. However, this information is not an accurate reflection of children’s ability.”
This last comment about the children’s ability seems to indicate that what the inspector expected to see, based on the data, did not match what was found in the school, so an alternative hypothesis had to be concocted based on a handful of twenty-minute observations of children and teachers. Either that, or the inspector is able to judge children’s ability based on a single day in a school.
The report goes on to say, “In Years 5 and 6, as a result of good teaching, pupils make good progress, especially the most able. In other year groups the same good progress is not as evident, and the work in pupils’ books does not consistently support the attainment levels given.”
The progress measures are relative, as cohorts are clearly being compared to each other or to other schools nationally. But the cohorts are groups of children, and small ones at that. This school has roughly 25 pupils in each class. Why would they all progress at similar rates?
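The sampling problem here can be illustrated with a quick simulation. This is a hypothetical sketch, not Ofsted’s actual methodology or real pupil data: even if every pupil in the country had exactly the same underlying expected progress, cohorts of 25 would still show noticeably different average progress purely by chance.

```python
import random
import statistics

random.seed(42)

# Hypothetical model: each pupil's "progress" score is drawn from the
# same distribution (mean 12, standard deviation 4), so by construction
# every school is identical underneath -- any difference between cohort
# averages below is chance alone.
def cohort_mean(n_pupils=25):
    return statistics.mean(random.gauss(12, 4) for _ in range(n_pupils))

cohort_means = [cohort_mean() for _ in range(1000)]

# Spread of cohort averages arising from chance alone.
print(min(cohort_means), max(cohort_means))
print(statistics.stdev(cohort_means))  # roughly 4 / sqrt(25) = 0.8
```

With cohort averages routinely drifting the best part of a whole point either side of the true mean, ranking small schools against each other on a single year’s cohort average mostly ranks the noise.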
“Some groups of pupils across the school do not make enough progress in their writing skills. Key Stage 1 pupils’ attainment in mathematics is below the standards expected nationally,” at Randal Cremer Primary School. Almost 70% of children at this school speak languages other than English at home, and two thirds have qualified for Free School Meals in the last six years. So let’s damn the hard work of the children, families and the school by comparing them to those not struggling against the odds. You simply can’t compare this school in this way.
Moving to Featherstone Wood Primary School in Stevenage, “In Years 3 to 6, too few pupils made expected or better than expected progress in mathematics which is why, in the 2013 end of Key Stage 2 tests, standards were well below the national average.” Yes, I’m sure that this is a perfectly reasonable assumption if you are an OFSTED inspector and have no idea how you are abusing bad data to make these unsubstantiated claims.
The report continues, “In some lessons, pupils are spending too much time practising skills they have already learnt, rather than being moved quickly on to the next challenge. In one lesson, for example, some high ability pupils understood perfectly how to find percentages of numbers and could explain the process clearly because they had covered this in a previous lesson. Instead of providing more challenge, the teacher asked the pupils to practise these skills further which did not help them to move onto the next steps.”
But, hang on, we’re all being told by Ofsted that we’re preparing “pupils to solve familiar and unfamiliar problems and demonstrate fluency and accuracy in recalling and using essential knowledge and mathematical methods.” So why damn the school for allowing children to do exactly that in the snippet of the lesson you saw? Do you want to see ‘progress’ in every lesson in a way your boss told you not to? Michael Wilshaw specifically said: “It is unrealistic, too, for inspectors to necessarily expect that all work in all lessons is always matched to the specific needs of each individual.” But let’s ignore the boss, because the school’s data told you that pupils weren’t making ‘expected progress’ and this suits your pre-judgement of the school.
At Christ Church, Church of England Junior School, Downend, we’re back to familiar nonsense as, “Teaching requires improvement because over time it has not been good enough to help pupils make consistently good progress.” Or, the data said so, Sir Michael, and I can’t think for myself.
For comparison, and because I was getting fed up with reading the same ridiculous judgements by people who have absolutely no right to judge schools in this way, I looked at the first school which OFitsHEAD had pre-judged as Outstanding.
At Bridgewater Primary School (108466), “Pupils achieve exceptionally well in this school. Those pupils who start school in the Nursery and remain in the school until the end of Year 6 make rapid and sustained progress. Those pupils who join the school other than in Nursery class also make rapid progress.” Now, this is a school in a challenging part of Newcastle upon Tyne, not some leafy suburb, and it’s nice to think that these children have made ‘rapid and sustained progress’ between five and eleven years of age, although how you could prove that was solely the result of their schooling and no other factors whatsoever, I have no idea.
Mind you, “While the work pupils are asked to do in writing and mathematics is very closely matched to their learning needs, in science and topic books the work they are given is not always as closely matched to their needs. Therefore, the progress they make is not as rapid as it is in writing and mathematics,” so it’s a good job schools aren’t judged on science and the extended curriculum, because Bridgewater would be getting it in the neck if they were.
You will not be in the least surprised to find that the school has data which shows achievement to be ‘Outstanding.’ It wasn’t in 2012, but it’s much better now, apparently. Oh, and the school is in the 1st quintile pretty much across the board in its similar school measures according to the OFSTED Schools Dashboard. Which shouldn’t mean anything, as the comparisons in the Data Dashboard are random, but oddly enough, they seem to reflect the Inspector’s view of the school. Or perhaps it was the other way around.
Oh, and the school’s Value Added scores are all above 100, so that’s good – although the statistically invalid confidence limits suggest that three of them might actually be below 100, though I doubt that many inspectors really understand what these figures mean or why they are rubbish.
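The point about confidence limits can be made concrete with illustrative numbers (a hypothetical sketch, not this school’s actual figures or the DfE’s published calculation): with a cohort of 25, a value-added score a fraction above 100 comes with an interval comfortably wide enough to include 100, i.e. it is indistinguishable from “bang on average.”

```python
import math

# Hypothetical illustration: a VA-style score centred on 100, where each
# pupil's individual score has a standard deviation of about 2.5 points.
def confidence_interval(school_score, pupil_sd=2.5, n_pupils=25, z=1.96):
    # The standard error of the cohort mean shrinks only with sqrt(n),
    # so small cohorts give wide intervals.
    se = pupil_sd / math.sqrt(n_pupils)
    return (school_score - z * se, school_score + z * se)

low, high = confidence_interval(100.3)
print(round(low, 2), round(high, 2))  # interval straddles 100
```

A headline score of 100.3 with an interval of roughly 99.3 to 101.3 tells you nothing about whether the school is genuinely above or below average, which is exactly the distinction inspectors are treating it as settling.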
Still, according to the data, it all looks good, and no doubt the inspector ambled into school having every prejudice confirmed.
So where should we go next? How should we stop this ridiculous nonsense?
As the recent Watching the Watchmen report from Policy Exchange notes, “Both the data analysis undertaken for this report and responses to the call for evidence were clear that the data on Achievement is a significant driver for the overall result of the school. Indeed, the Achievement of Pupils sub grade correlates with the overall grade in around 99% of cases.” The report continues, “For schools graded as a 3 (Satisfactory/Requires Improvement) or a 4 (Inadequate), this is the strongest correlation by some way.” As anyone in education can tell you, schools are damned or lauded by data before Inspectors arrive at the school gate.
I want to make it absolutely clear, however, that when Policy Exchange suggest that OFSTED should stop observing lessons - as welcome as this would be - and inspect based on data, they are dangerously wrong to do so. The data simply cannot justify the judgements that OFSTED and others currently make about schools. As the Policy Exchange report notes, “There is more data in schools than ever before about progress and achievement of pupils.” But if the data is not even wrong, we should not be changing the system to rely on it even more than we do currently.
Watching the Watchmen concludes “that an over-reliance of data is not without risks – particularly for smaller schools, and for schools that do not use National Curriculum levels. But on balance, the conclusion is that the data – when properly moderated, and when scrutinised by inspectors fluent in data analysis – offers much potential for assessing schools in a valid and reliable way.”
I have shown over the past few articles that this conclusion is damagingly incorrect, and that the assumptions underlying the current use of data mean that it simply is not possible for inspectors to accurately scrutinise data analysis which is fundamentally flawed on every level. If you use the nonsense RAISEonline purports to demonstrate about 'achievement' and 'progress' to 'inform' your judgement before you even enter a school, you come out with utterly unfair judgements. RUBBISH in, rubbish out.
We need to urgently rethink how schools should be evaluated, so that OFSTED stop making these unfair judgements. Using the current data model is not the way forward. We need to stop damning children and schools with achievement judgements based on spurious data alone.