Comparing the councils: is it even possible?

Last week, I wrote about a claim made in the West Berkshire Conservatives’ manifesto that West Berkshire Council (WBC) was the “best performing” authority in England. This was inferred from a research spreadsheet provided by the company which, in June 2022, produced a report about the value for money residents get from their council tax. The spreadsheet was only sent to me after I’d written last week’s piece, though it could easily have been sent to me before by the WBC Conservative Councillor who had originally sent out the press release about the claim.

In order to understand what’s going on here we need to look at three things: the report, and what it tells us (and what it does not); the spreadsheet which, not having been published, cannot be regarded as a source; and what other information might be looked at to make the results more useful.

The report

The report had a limited and specific objective. As the title says, it considers “the UK regions getting the best return on their council tax.” To do this, it looked at six areas: the percentage of roads that required repairs, the percentage of household waste that was recycled, crimes per 10,000 people, the average response time of the fire service, the percentage of schools rated good or outstanding and the percentage of care homes rated good or outstanding. Using data scraped from national websites, it accorded ranks to each. These were then averaged to produce an intermediate overall performance score (we’ll come back to that in a moment) and the results were then weighted by the amount of council tax the district charged. By this measure, West Berkshire came eighth out of 304: a very creditable result.
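As I understand it, the method can be sketched in a few lines of Python. The council names and figures below are invented purely for illustration; nothing here is taken from the actual spreadsheet.

```python
# Sketch of the report's method as I understand it: rank each council
# in six areas, average the ranks into an intermediate "overall
# performance" score, then weight by council tax. All figures here
# are invented for illustration, not taken from the spreadsheet.
councils = {
    "Anytown":   {"ranks": [8, 25, 40, 80, 30, 55], "band_d_tax": 1966},
    "Otherford": {"ranks": [1, 150, 3, 200, 90, 10], "band_d_tax": 1500},
}

results = {}
for name, data in councils.items():
    # Intermediate score: a straight average of the six ranks (1 = best)
    performance = sum(data["ranks"]) / len(data["ranks"])
    # Final "value for money" score: the performance score weighted by
    # council tax, so a cheaper council gets more credit for the same ranks
    results[name] = performance * data["band_d_tax"]

# Lower final score = better value for money
for name, score in sorted(results.items(), key=lambda kv: kv[1]):
    print(name, round(score))
```

The key point for what follows is that the averaging step and the weighting step are separate: the intermediate score exists only on the way to the weighted one.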

These six criteria don’t represent all that these councils do but I can see three reasons why they were chosen. They were measurable according to national standards; they covered statutory obligations; and they covered major areas of expenditure. These thus provided a consistent measure of these very limited objectives.

None the less, it’s worth pointing out how broad-brush this approach is. Other criteria could have been adopted, though the figures might have been difficult or impossible to find. The percentage of roads that required repairs could be a less useful guide to a council’s performance than how quickly any reported problems were fixed. The crimes could do with being defined; perhaps only the serious ones would have been useful. Questions have been raised about both the frequency and in some cases the nature of school and care-home assessments, which could be smoothed by ignoring any that hadn’t taken place in the last five years. Social care is also about a lot more than just people in care homes. In all cases, the results could have been weighted by looking at improvement and providing a figure as a change from a previous period.

The report did the best it could with the information that was easily available. To have followed any of the suggestions above (or below) would have caused much more work. The methodology and sources were clearly stated. I therefore have no complaint about the report, bearing in mind these limitations, nor with the conclusions it reached.

The spreadsheet

We now turn to the spreadsheet. This was not published by the company, so I’m not going to publish it either. The intermediate overall performance score places WBC top and it was this claim that was seized upon in the manifesto. To say that it was the best performing council in the country, however, was misleading, for three main reasons.

  • The Conservatives’ manifesto said that “in a recent survey of 304 local authorities across the country, West Berkshire was seen as being in the top ten for value for money and the best for overall performance.” This implies two separate conclusions, whereas there was only one. I asked the above-mentioned Conservative spokesperson about this and was told that “the best overall claim was removed from the report, unfortunately.” It’s now clear that this is not the case and that it had never been part of it.
  • The “overall performance” score was only an intermediate ranking. Were any measure of “best overall” to have been intended, any reputable report would have looked at a range of other factors including the time taken to respond to emails or answer calls, the number of upheld complaints, the progress towards net-zero, communications, transparency, inclusivity, the length of time taken to determine planning applications, budget deficit, the ethical standards of investment – the list could go on and on. Most of these cannot easily be measured or compared. The report didn’t look at them, so no comparative judgement is possible. Any consideration of “best overall” is pointless if only the six measurable factors, and none of the others, are considered.
  • The results needed to be weighted by the amount of council tax that was charged. These are meaningless if this aspect is not included, as in many (though, as I shall explain, not all) cases, it’s reasonable to expect that the more council tax a district charges, the better its services should be. This estimate of the bang for your council-tax buck was all the report was aiming for. WBC’s 2022-23 council tax is the highest of those in the top ten and it was marked down to eighth accordingly. This is still a very commendable score: though again I must stress that the report only looked at six very specific areas.

It’s also worth having a look at some aspects of the spreadsheet that go beyond what the report intended to express but which none the less exist. I’m surprised that, considering how long the WBC Conservatives have been in possession of this, they did not bother to look beyond the crude and misleading “best overall” figure and see the more nuanced but generally positive stories that even a quick study of it reveals.

I’ve also looked at some other ways the data could be weighted to achieve a better result and pointed out some of the problems this would cause.

• Weighting by consistency

The spreadsheet shows the individual scores for each of the six activities. As WBC scored between eighth and 80th in these (out of 304), at first glance it seemed odd that it could have achieved the high position of eighth overall when all but one of its individual scores were worse than this. The reason is that the overall score, and so the derived final ranking, were an average of these six. Some councils scored higher on some measures but were awful on others; others were consistent but at a lower level. WBC had three dark green (the best) and three light green (the next best) in its rankings. This shows that WBC was not only consistent but consistently good.

This is in many ways a more positive conclusion even than the overall ranking of eighth as it doesn’t point to any areas of major under-performance in these six areas which the averaging process might conceal. One might infer, or certainly at least hope, that this could be extended to other areas of the council’s activities. However, there’s no evidence for this either way as the report did not concern itself with these. Nor, surprisingly, did WBC’s administration come to this conclusion for itself.
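A toy example (all ranks invented) shows how the averaging can conceal this: two councils can have identical average ranks but very different spreads, and only the spread reveals which one is consistent.

```python
# Two hypothetical councils with the same average rank across six
# measures. "Steady" is consistently decent; "Patchy" mixes excellent
# results with very poor ones. The average alone conceals the difference.
steady = [20, 25, 30, 35, 40, 30]
patchy = [1, 2, 3, 50, 60, 64]

def average(ranks):
    return sum(ranks) / len(ranks)

def spread(ranks):
    return max(ranks) - min(ranks)

print(average(steady), average(patchy))  # identical averages: 30.0 and 30.0
print(spread(steady), spread(patchy))    # very different spreads: 20 vs 63
```

Any final ranking built on the average alone would treat these two councils as equals; the spreadsheet's individual scores are what let you tell them apart.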

• Weighting by revenue source

The council tax rate has a large impact on these rankings. WBC’s band D rate in 2021-22 was £1,966, higher than any of the others in the top ten and putting it about in the middle of the range of all those in England.

However, the report ignores the fact that councils derive a different percentage of their revenue from council tax (which is all the report was considering). The table in this SW Londoner article from April 2022 shows that Wandsworth (which comes top in the ranking) and many other London boroughs derive only perhaps 20% of their revenue from council tax. In WBC’s case, it’s 61% (with another 10% coming from the adult social care precept). Indeed, WBC is at the higher end of the national table for the percentage of revenue raised from this source.

The problem with this is that the services councils provide are funded from all revenue, some of which, like charges for parking permits, will also fall on residents. The pros and cons of the various funding methods would be a separate conversation: however, in this case, it follows that such a survey is only useful if the council-tax weighting is itself weighted to reflect how much revenue comes from this source. At present, it isn’t. To do so would be very complex. However, not to do so is slightly like comparing the taxation burden in various countries by looking only at the direct taxes and ignoring the indirect.

This therefore makes the respective final rankings open to doubt: particularly that of Wandsworth, a borough with a very low council tax. It could be worse – were this report to have been produced in the early 1990s, Wandsworth then charged no council tax at all (I was living there then). Its score by this measure would therefore then have been infinity.
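One crude way of making this concrete is to divide each council's tax by the share of its revenue that council tax represents, giving an implied total spend per band D household. The band D figures below are illustrative; the revenue shares are the rough percentages quoted above.

```python
# Sketch: adjust the council-tax figure by the share of revenue that
# council tax represents. Band D rates here are illustrative; the
# shares are the rough figures quoted above (WBC ~61%, a low-tax
# London borough such as Wandsworth ~20%).
councils = {
    "West Berkshire": {"band_d": 1966, "tax_share": 0.61},
    "Wandsworth":     {"band_d": 845,  "tax_share": 0.20},
}

for name, c in councils.items():
    # If council tax is only tax_share of the money the council spends,
    # the implied total per band D household is correspondingly larger
    implied_total = c["band_d"] / c["tax_share"]
    print(name, round(implied_total))
```

On these illustrative figures, Wandsworth's implied total spend per household is actually the higher of the two, which is exactly why weighting by the raw council-tax figure alone flatters low-tax, grant-funded boroughs.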

• Weighting by expenditure

Another way of weighting the scores would be to look not at how councils raised money but what they spent.

An immediate problem is that not all councils are the same size (Kent has nearly 1.6m residents and the Isles of Scilly only about 2,200) so you’d then have to weight it again by population. Harder to deal with would be the fact that demographics vary, placing varying burdens on services. The average age in North Norfolk, for instance, is 54 (it also has a third of its population aged 65+) whereas Oxford’s average age is 29. You’d therefore expect NN to spend more on adult social care. How much more, though? Twice as much? 54 divided by 29? Some other number?

• Weighting by service importance

In the same way as weighting the results by the percentage of revenue derived from council tax, it would also make sense to weight the six categories by how important they are. The easiest way to gauge this is by expenditure. In WBC’s case, 5% goes on waste, 6% on education and 50% on social care. These proportions will vary (though perhaps not massively) from district to district, often with good reason.

However, all are given exactly the same weight in the calculation of the final score. Is this fair given the widely different sums spent on them? WBC comes eighth in social care (here, only three London Boroughs feature in the top 30), which is very good. Were social care to be given a slightly higher weighting – as with the expenditure, the problem would be working out how much higher – WBC would thus have received a higher overall score.
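As a sketch of what expenditure-based weighting might look like: the ranks below are invented (except that the social-care rank is set to eighth, WBC's actual position); the shares for waste, education and care are those quoted above, with the remaining 39% split evenly across the other three categories as an assumption.

```python
# Hypothetical ranks (1 = best) in the six categories; only the
# social-care rank of eighth reflects a real figure quoted above.
ranks = {"roads": 40, "waste": 30, "crime": 55, "fire": 25,
         "schools": 80, "care": 8}

# Expenditure-based weights: waste 5%, schools 6%, care 50% (as quoted
# above for WBC); the remaining 39% split evenly as an assumption.
weights = {"roads": 0.13, "waste": 0.05, "crime": 0.13, "fire": 0.13,
           "schools": 0.06, "care": 0.50}

equal_score = sum(ranks.values()) / len(ranks)          # what the report does
weighted_score = sum(ranks[k] * weights[k] for k in ranks)  # weighted by spend
print(round(equal_score, 1), round(weighted_score, 1))
```

Because the strong social-care rank now carries half the weight, the weighted score (lower is better) improves markedly on the equal-weight average.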

• Weighting by urbanisation

Council tax rates also seem to have a lot to do with how densely populated the districts are, with urbanised ones tending to charge less. Of the 40-odd councils charging less than £1,800pa, the vast majority are in built-up areas. Even more striking, of the 16 councils with the lowest rates, only two (Windsor and Maidenhead and the Isles of Scilly) are not in Greater London. If one removed these from the calculations (rather a crude way of dealing with the problem, I admit), WBC would be second in the overall weighted table.

• Weighting by completeness

With the exception of Windsor and Maidenhead, WBC was the only council for which data was available in all six categories, all the others missing one. It would therefore follow that any conclusions derived from these two councils can be regarded as 16.66% more reliable than those from any of the others.

• Weighting by something else…

…but here, we probably need to stop. By now, we’re at the point when the amount of research needed across all 300-odd councils is becoming unfeasibly complex. Much the same could be said of the spreadsheet needed to calculate it.

In conclusion

As the above shows, there are many ways any such data can be sliced and also many other factors that can be applied to make the results more useful, though many might be disproportionately difficult. What else does all this suggest?

  • We are dealing here with what was actually published – the highlights of the rankings weighted for council tax – and not any intermediate data. The report had a very specific objective, clearly stated in the title, and any attempt to infer something different is dangerous.
  • If a report ranking councils by overall performance were intended then it would be titled as such and, to be meaningful, would look at more than just six factors. I would suggest that this would also be impossible as many would be unmeasurable or have elements of subjectivity.
  • Another way of assessing whether the report’s intermediate figures provide a reliable guide to overall excellence, as the WBC Conservatives’ manifesto claims, is to see how other councils do by the same measure. Three that stand out as bad examples are Slough, Thurrock and, in particular, Croydon. All have deep-rooted problems, have issued Section 114 notices (effectively, admissions of bankruptcy) and have collective debts that run into billions. One would thus expect that – were the intermediate and unweighted figures from the report to provide a sound guide to overall municipal performance – these would be in the drop zone. Not a bit of it. Out of the 304 councils, Thurrock is at 226, Slough at 197 and Croydon an almost unbelievable 125 – so by this measure, nearly six out of ten councils are regarded as being worse than Croydon. Any report of overall excellence (which was never the report’s intention, though some suggest otherwise) loses much credibility in the face of rankings like these.
  • Of the six factors considered, four – crime levels, fire response, care-home ratings and school Ofsted reports – cover areas for which the council has what might be termed a regulatory or supervisory role and where the bulk of the decision-making and day-to-day work is done respectively by the police, the fire service, the care-home managers and the schools; and by the various regional boards or bodies which manage them, some of which extend across several districts. With the last two, one also has to consider how many are run by the council and how many privately by academies. Only roads and recycling are wholly in a council’s control and even these might be operated by others through long-term contracts which the council has less power to amend than it might wish. To a large extent, this report thus tells us more about the effectiveness of the local fire service, the local academies and all the rest of them than it does about the council as such.
  • If the intermediate figures are to be used to claim overall excellence (as they have been) then they need to be qualified (which they haven’t been). To plug this gap, here’s mine: “Using an intermediate ranking – created to examine a separate issue and which considered just six very specific areas of activity – WBC came top. However, this ranking was not published, took no account of any other factors and was not weighted to reflect the level of council tax.” I’d accept that.
  • I’ve suggested that weighting merely by council tax is a bit crude. I’ve proposed others. However, the worst method of all is to apply no weighting at all.
  • Finally, I’m unsure how many of the claims in the report are wholly the result of the political complexion of the council. All are statutory and, as mentioned above, in many cases partly devolved. If there is credit to be dished out, surely it is the officers who should receive a fair share of it? I’ve had many dealings with them in the last four years, almost without exception positive. I’m sure that Lynne Doherty and her team would agree that praise is due here. Making a big deal of this report in an election manifesto doesn’t really give this impression.

WBC does well from the report and, in many ways, even better as a result of the points I’ve made above. It is not the “best overall” council, however, and the report did not suggest this. Even if I saw a ranking for “best overall” council, I would be deeply suspicious of whether every factor had been taken into account and correctly measured or weighted.

Above all, most of this work was done by the officers. They will remain in post after the 4 May elections. Some of the current councillors will not. That will be your time to judge such claims of performance. May the fourth be with you…

Brian Quinn


