I have a very short piece in City AM today arguing for the continued relevance of league tables, despite the hoo-ha over various well known public schools finding themselves at the bottom because of a quirk whereby their IGCSEs are not recognised. Let me expand beyond the 160-word limit to set out more clearly what I think, and specifically to address three questions raised by sceptics of yesterday’s tables.
Does it matter if Eton, Harrow et al are at the bottom of the league tables?
Not a bit. This is, as I say, self-evidently a quirk caused by independent schools sometimes offering a different type of qualification from those recognised by government in the league tables. It does not mean that we will see hordes of Etonians and Harrovians fleeing to their local state schools. Moreover, it is difficult to imagine that such a quirk will put off prospective parents; although academic strength is a key factor for parents choosing an independent school, the reputation of such schools precedes them.
A more interesting question is whether state schools that have consciously decided to do what they believe is right for their pupils, even at the expense of league table position – for example by entering them for resits – will suffer as a result. The only answer is that each school must make its own decision as to what it ultimately thinks is right. That does not mean that the nudging effect of league tables is in itself a flawed approach, especially as everyone involved in the system – including the government – encourages parents and staff to consider factors wider than league tables alone.
Does it matter if we can’t truly compare 2014 results to 2013 results?
Yes, a little bit. But there are three important caveats. Firstly, league tables can be used both ‘horizontally’ (comparing the performance of particular schools, or the system as a whole, longitudinally) and ‘vertically’ (comparing the performance of different schools under the same set of circumstances in any one year). The latter is not affected by the absence of the former. Secondly, such a limitation is only short term, as comparable results build year on year – though we are likely to see another shift when the exams taught under the new curriculum are first assessed in 2017. And thirdly, and most importantly, accuracy is more important than consistency. The changes made to strip out low-value vocational qualifications and to discount retaken exams are a conscious attempt to raise standards across the system. I would rather turn my car around knowing I am now going in the right direction than have the false reassurance of driving consistently in the wrong one.
Does it matter if league tables only give a partial picture of performance?
No, in a word. Of course government and Ofsted should (and do) take into account wider context and capacity to improve when looking at individual schools, and of course parents should (and do) visit schools and weigh a wide variety of factors when expressing a preference. There is also a proliferation of alternative tables being created that allow for different perspectives – as discussed in a recent roundtable between Policy Exchange, Cambridge Assessment and the Open Public Services Network. These might include wage gains from particular subjects (championed by Nicky Morgan), destination data from each school, or a truly bespoke “what is school x or y like for people like me (or my child)?”.
This isn’t the same as saying, however, that league table data or exam scores don’t have any merit. Recent analysis of data from the Millennium Cohort Study, for example, confirms a long-demonstrated finding in the literature that parents value the academic standards of a school when choosing it – close to 60% of parents of secondary-aged children and over 40% of parents of primary-aged children mentioned “good exam results / academic reputation” as an important factor when choosing a school. And a study from Bristol comparing the performance of the Welsh and English systems – where the former abolished school league tables in 2001 – also concluded that “The abolition of school league tables had a negative impact on school performance in Wales, relative to England, by almost two GCSE grades per student per year on average. This effect was concentrated in the lower 75% of schools, with the poorest and lowest average ability schools falling behind the most. The results were not driven by English schools ‘gaming’ the league tables after 2001, and did not vary by the degree of competition faced by schools”.
More broadly, these three arguments are just education-specific examples of the wider argument about the value of open data and accountability in public services. My firm view is that such data (augmented, of course, by intelligent scrutiny and other, softer measures and experiences of services) is a vital tool for driving improvement, particularly in services accessed by people who do not have the option to buy themselves better services in any way. The same arguments are raised time and again – data will be irrelevant, it will be out of date, it will be misleading, it will skew behaviours in favour of the sharp-elbowed – and almost always turn out to be false. Two quick examples illustrate the point.
Firstly, the reforms of Governor O’Malley in Maryland, and before that as mayor of Baltimore, are a world-leading example of how data can be used to reduce costs, improve public services and build public confidence. Public service absenteeism in Baltimore dropped by 50% and the city saved $350m through efficiencies under his tenure. In Maryland, job growth is strong, violent crime is at a 37-year low and infant mortality continues to fall. The state now produces an annual report card which allows for quick, at-a-glance data on public service performance.
Secondly, in England, cardiac surgeons have had mortality rates published at an individual level since 2001. The move was much criticised, with fears that doctors would shy away from treating high-risk patients and that such figures were simplistic and misleading. The results, however, are conclusive. Outcomes for adult cardiac patients have significantly improved since 2001, with lower mortality rates and less need for second operations thanks to a sharing of best practice. Most encouragingly, contrary to the sceptics, the availability of reliable data on performance has emboldened surgeons to take on more complex cases. As a result, more elderly and high-risk patients are being treated, not fewer.
Of course schools are more than just data. But we all – parents, taxpayers, citizens – deserve good and full information on school performance to help us understand what schools are doing.